When it comes to using artificial intelligence (AI) in your practice, pediatric dermatologist Albert Yan, MD, professor of pediatrics and dermatology at the University of Pennsylvania, Philadelphia, suggests that dermatologists “just jump in” and become familiar with the various AI models.

He reminds doctors that many of their colleagues, as well as patients and their families, are already using these systems, “and you don’t want to be left behind.”

In an interview following his presentation on AI at the annual meeting of the Society for Pediatric Dermatology (SPD), Dr. Yan discussed his tips for using AI.
 

Changing Fast 

From the outset, most generative AI systems have been very good at processing language — for example, generating letters of medical necessity and summarizing disease processes into lay terms. But now they’re becoming “truly multimodal,” said Dr. Yan. “You can enter images; you could have it process audio; you can even start to have it refine video.”

To get started, he recommends signing up for a free account with ChatGPT, Gemini, Perplexity, Claude, and/or Microsoft Copilot. “To make the best choice, you have to try them out yourself because they each have their own kind of flavor and strengths and weaknesses,” said Dr. Yan.

Personally, he finds that ChatGPT is the most versatile, Gemini perhaps a little better in terms of image generation, and Perplexity probably the best at references because it was designed as an online library.



Once you figure out which platforms you prefer, consider signing up for a premium subscription, which is typically month to month and can be canceled at any time, Dr. Yan said. “This will allow you to get the most out of the AI model.”

Because these AI systems are based on large language models, they are excellent with text, Dr. Yan noted. He suggests asking one to generate a letter or patient instruction sheet. “If you have a premium model, give it a PDF to summarize an article or take a photo of something that you want its opinion on.”

Privacy Critical

Always pay attention to privacy issues and avoid entering any private health information that would violate the Health Insurance Portability and Accountability Act (HIPAA), he said.

“We have to be very careful about how we interact with AI,” said Dr. Yan. “We can’t be posting private patient health information into these systems, no matter how useful these systems are.” Many academic institutions are creating “walled gardens” — private areas of AI access that don’t allow patient information to “leak out,” he said. “These AI models may have HIPAA protections in place and come with specific guidelines of use.”

The AI “scribe,” which helps with electronic health record documentation, is one of the most useful tools for clinicians, he said. He referred to a recent study showing that an AI scribe saved users an average of 1 hour at the keyboard every day, and a small patient survey in which 71% of respondents reported that it led to their spending more time with their physician.

When entering requests into a prompt line with an AI system, Dr. Yan stressed that these prompts need to be clear and concise. For a complicated calculation or multistep problem, try adding the words “let’s do this step by step,” he said. “This is a technique invoking a ‘chain of thought’ that allows the system to enhance its accuracy when solving problems.”
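For readers who prefer to script such requests rather than type them into a chat window, the sketch below shows what a “step by step” prompt might look like sent programmatically. It is a minimal illustration only, assuming the OpenAI Python SDK; the model name and the sample dosing question are hypothetical placeholders, not recommendations from the presentation.

```python
# Minimal sketch: sending a "let's do this step by step" prompt through the
# OpenAI Python SDK. Model name and clinical question are illustrative only.
# Never include identifiable patient information in prompts sent to external services.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

prompt = (
    "A 12-kg toddler needs cephalexin at 50 mg/kg/day divided three times daily. "
    "Calculate the dose per administration. Let's do this step by step."
)

response = client.chat.completions.create(
    model="gpt-4o",  # placeholder; any current chat model could be used
    messages=[{"role": "user", "content": prompt}],
)

print(response.choices[0].message.content)
```

Appending the step-by-step cue simply nudges the model to show its intermediate reasoning, which is the chain-of-thought technique Dr. Yan describes for improving accuracy on multistep problems.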

If the response is not satisfactory, try being more detailed in the request, he advised, and consider giving the system examples of what you’re looking for and telling it what you don’t want in the output.

“For instance, if you’re asking for a differential diagnosis of rashes that affect the hands and feet, you can stipulate that you only want rashes that are vesicular or that arise in neonates, so you can get a more focused answer,” said Dr. Yan.

For “long-winded verbose” responses, add the phrase “be concise,” and it will shorten the response by about 50%, he added.

AI Hallucinations

Dr. Yan broached an issue that occasionally comes up: AI hallucinations, which are inaccurate or misleading responses arising from incomplete training or intrinsic biases within the model. He pointed to the case of a doctor discussing issues related to a patient’s hands, feet, and mouth, which the AI model summarized as “the patient being diagnosed with hand, foot, and mouth disease.”

Another example he provided was a request to generate a letter of medical necessity for the use of ustekinumab (Stelara) to treat hidradenitis suppurativa (HS) in a child, including references supporting its effectiveness and safety in children. The AI system generated “false references that sounded like they should be real because the authors are often people who have written in that field or on that subject,” said Dr. Yan.

When pressed, the system did acknowledge that the references were hypothetical but were meant to illustrate the types of studies that would typically support the use of this drug in pediatric patients with HS. “It’s well meaning, in the sense that it’s trying to help you achieve your goals using this training system,” said Dr. Yan.

“If you’re skeptical about a response, double-check the answer with a Google search or run the response through another AI [tool] asking it to check if the response is accurate,” he added.

While AI systems won’t replace the clinician, they continue to improve and become more sophisticated. Dr. Yan advises keeping up with emerging developments and adopting and adapting the most appropriate AI tools to an individual clinician’s work.

Asked to comment on the presentation at the SPD meeting, Sheilagh Maguiness, MD, director of the Division of Pediatric Dermatology at the University of Minnesota, Minneapolis, who, like other doctors, is increasingly testing AI, said she foresees a time when AI scribes fully replace humans in completing documentation tasks during patient interactions.

“The hope is that if the AI scribes get good enough, we can just open our phone, have them translate the interaction, and create the notes for us.”

While she likes the idea of using ChatGPT to help with tasks like letters of recommendation for medications, Dr. Yan’s comments reiterated the importance of “checking and double-checking ChatGPT because it’s not correct all the time.” She particularly welcomed the advice “that we can just go back and ask it again to clarify, and that may improve its answers.”

Dr. Yan’s disclosures included an investment portfolio with companies working in the AI space, including Google, Apple, Nvidia, Amazon, Microsoft, and Arm. Dr. Maguiness had no relevant disclosures.

A version of this article first appeared on Medscape.com.
