Teaching Evidence-Based Dermatology Using a Web-Based Journal Club: A Pilot Study and Survey


To the Editor:

With a steady increase in dermatology publications over recent decades, there is an expanding pool of evidence to address clinical questions.1 Residency training is when skills in appraising the medical literature and practicing evidence-based medicine are most honed. Evidence-based medicine is an essential component of Practice-based Learning and Improvement, a required core competency of the Accreditation Council for Graduate Medical Education.2 Assimilation of new research evidence is traditionally taught through didactics and journal club discussions in residency.

However, at a time when the demand for information overwhelms safeguards that exist to evaluate its quality, it is more important than ever to be equipped with the proper tools to critically appraise novel literature. Beyond accepting a scientific article at face value, physicians must learn to ask targeted questions of the study design, results, and clinical relevance. These questions change based on the type of study, and organizations such as the Oxford Centre for Evidence-Based Medicine provide guidance through critical appraisal worksheets.3

To investigate the utility of using guided questions to evaluate the reliability, significance, and applicability of clinical evidence, we beta tested a novel web-based application in an academic dermatology setting to design and run a journal club for residents. Six dermatology residents participated in this institutional review board–approved study comprising 3 phases: (1) independent article appraisal through the web-based application, (2) group discussion, and (3) anonymous postsurvey.

Using this platform, we uploaded a recent article into the interactive reader, which contained an integrated tool for appraisal based on specific questions. Because the article described the results of a randomized clinical trial, we used questions from the Centre for Evidence-Based Medicine’s Randomised Controlled Trials Critical Appraisal Worksheet, which has a series of questions to evaluate internal validity, results, and external validity and applicability.3

Residents used the platform to independently read the article, highlight areas of the text that corresponded to 8 critical appraisal questions, and answer yes or no to these questions. Based on residents’ answers, a final appraisal score (on a scale of 1% to 100%) was generated. Simultaneously, the attending dermatologist leading the journal club (C.W.) also completed the assignment to establish an expert score.
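
The platform's scoring algorithm was not published; a plausible reading is that the score reflects the share of appraisal questions answered yes. The following minimal Python sketch assumes that scheme, with question wording paraphrased from the CEBM randomized controlled trial worksheet; all names are illustrative, not the platform's actual code:

```python
# Minimal sketch of a percentage-based appraisal score, assuming the score
# is the fraction of questions answered yes. The platform's actual scoring
# algorithm was not published; question wording is paraphrased from the
# CEBM randomized controlled trial critical appraisal worksheet.

APPRAISAL_QUESTIONS = [
    "Was the assignment of patients to treatments randomized?",
    "Were the groups similar at the start of the trial?",
    "Was allocation concealed?",
    "Were patients, health workers, and study personnel blinded?",
    "Were all patients accounted for at the trial's conclusion?",
    "Were patients analyzed in the groups to which they were randomized?",
    "Was the treatment effect large and precise enough to be meaningful?",
    "Can the results be applied to my patient population?",
]

def appraisal_score(answers: list[bool]) -> float:
    """Return the percentage of appraisal questions answered yes."""
    if len(answers) != len(APPRAISAL_QUESTIONS):
        raise ValueError("expected one answer per question")
    return 100.0 * sum(answers) / len(answers)

# Answering yes to 6 of the 8 questions yields 75%, the consensus score
# the residents reached in this study.
print(appraisal_score([True] * 6 + [False] * 2))  # 75.0
```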

Scores from the residents’ independent appraisal ranged from 75% to 100% (mean, 85.4%). Upon discussing the article in a group setting, the residents established a consensus score of 75%. This consensus score matched the expert score, which suggested to us that both independently reviewing the article using guided questions and conducting a group debriefing were necessary to match the expert level of critical appraisal.

Of note, the residents’ average independent appraisal score was higher than both the consensus and expert scores, indicating that the residents evaluated the article less critically on their own. With more practice using this method, it is possible that the precision and accuracy of the residents’ critical appraisal of scientific articles will improve.
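
To make this comparison concrete, the gap between independent and group appraisal can be summarized directly from the scores. In the short sketch below, the individual resident scores are hypothetical placeholders chosen only so that the range and mean match the reported values (75% to 100%; 85.4%):

```python
from statistics import mean

def summarize_appraisals(independent: list[float], consensus: float, expert: float) -> str:
    """Summarize independent scores and compare the consensus vs expert appraisal."""
    return "\n".join([
        f"independent: range {min(independent):.1f}-{max(independent):.1f}, "
        f"mean {mean(independent):.1f}",
        f"consensus {consensus:.1f} vs expert {expert:.1f}: "
        f"{'match' if consensus == expert else 'differ'}",
    ])

# Hypothetical individual scores; only the range (75-100), mean (85.4),
# consensus (75), and expert (75) values were reported in this study.
scores = [75.0, 76.0, 82.0, 87.0, 92.4, 100.0]
print(summarize_appraisals(scores, consensus=75.0, expert=75.0))
# independent: range 75.0-100.0, mean 85.4
# consensus 75.0 vs expert 75.0: match
```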

In the postsurvey, we asked residents about the critical appraisal of the medical literature. All residents agreed that evaluating the quality of evidence when reading a scientific article was somewhat important or very important to them; however, only 2 of 6 evaluated the quality of evidence all the time, and the other 4 did so half of the time or less than half of the time.

When critically appraising articles, 2 of 6 residents used specific rubrics half of the time; 4 of 6 did so less than half of the time. Most important, 5 of 6 residents agreed that the quality of evidence affected their management decisions more than half of the time or all of the time. Although it is clear that residents value evidence-based medicine and understand the importance of evaluating the quality of evidence, doing so currently might not be simple or practical.

An organized framework for appraising articles would streamline the process. Five of 6 residents agreed that the use of specific questions as a guide made it easier to appraise an article for the quality of its evidence. Four of 6 residents found that juxtaposing specific questions with the interactive reader was helpful; 5 of 6 agreed that they would use a web-based journal club platform if given the option.

Lastly, 5 of 6 residents agreed that if such a tool were available, a platform containing all major dermatology publications in an interactive reader format, along with relevant appraisal questions on the side, would be useful.

This pilot study augmented the typical journal club experience by emphasizing goal-directed reading and the importance of analyzing the quality of evidence. The combination of independent appraisal of an article using targeted questions and a group debrief led to better understanding of the evidence and its clinical applicability. The COVID-19 pandemic may make this a better time than ever to explore innovative ways to teach evidence-based medicine in residency training.

References
  1. Mimouni D, Pavlovsky L, Akerman L, et al. Trends in dermatology publications over the past 15 years. Am J Clin Dermatol. 2010;11:55-58. doi:10.2165/11530190-000000000-00000.
  2. NEJM Knowledge+ Team. Exploring the ACGME Core Competencies: Practice-Based Learning and Improvement (part 2 of 7). Massachusetts Medical Society. NEJM Knowledge+ website. Published July 28, 2016. Accessed January 15, 2022. https://knowledgeplus.nejm.org/blog/practice-based-learning-and-improvement/
  3. University of Oxford. Critical appraisal tools. Centre for Evidence-Based Medicine website. Accessed January 2, 2022. www.cebm.ox.ac.uk/resources/ebm-tools/critical-appraisal-tools
Author and Disclosure Information

Drs. Chuchvara, Wassef, and Rao are from the Center for Dermatology, Rutgers Robert Wood Johnson Medical School, Somerset, New Jersey. Dr. Rao also is from the Department of Dermatology, Weill Cornell Medicine, New York, New York.

Drs. Chuchvara, Wassef, and Rao report no conflict of interest. Dr. Hasan is the founder/owner of MD Access LLC, which owns JournalClub.net. Dr. Hasan also is the co-founder/co-owner of RH Nanopharmaceuticals, LLC, and is a recipient of and co-investigator for National Institutes of Health grant #4R44NS113749-02 for drug development research under RH Nanopharmaceuticals, LLC.

Correspondence: Nadiya O. Chuchvara, MD, 1 Worlds Fair Dr, 2nd Floor, Ste 2400, Somerset, NJ 08873 (nadiyac94@gmail.com).


Practice Points

  • A novel web-based application was beta tested in an academic dermatology setting to design and run a journal club for residents.
  • Goal-directed reading was emphasized by using guided questions to critically appraise literature based on reliability, significance, and applicability.
  • The combination of independent appraisal of an article using targeted questions and a group debrief led to better understanding of the evidence and its clinical applicability.