Two Diets Linked to Improved Cognition, Slowed Brain Aging

Article Type
Changed
Wed, 07/31/2024 - 13:18

 

An intermittent fasting (IF) diet and a standard healthy living (HL) diet focused on healthy foods both led to weight loss, reduced insulin resistance (IR), and slowed brain aging in older overweight adults with IR, new research showed. However, neither diet had an effect on Alzheimer’s disease (AD) biomarkers.

Although investigators found both diets were beneficial, some outcomes were more robust with the IF diet.

“The study provides a blueprint for assessing brain effects of dietary interventions and motivates further research on intermittent fasting and continuous diets for brain health optimization,” wrote the investigators, led by Dimitrios Kapogiannis, MD, chief, human neuroscience section, National Institute on Aging, and adjunct associate professor of neurology, the Johns Hopkins University School of Medicine.

The findings were published online in Cell Metabolism.
 

Cognitive Outcomes

The prevalence of IR — reduced cellular sensitivity to insulin that is a hallmark of type 2 diabetes — increases with age and obesity, contributing to an increased risk for accelerated brain aging as well as AD and related dementias (ADRD) in older adults with overweight.

Studies have reported that healthy diets promote overall health, but it is unclear whether, and to what extent, they improve brain health beyond general health enhancement.

Researchers used multiple brain and cognitive measures to assess dietary effects on brain health, including peripherally harvested neuron-derived extracellular vesicles (NDEVs) to probe neuronal insulin signaling; MRI to investigate the pace of brain aging; magnetic resonance spectroscopy (MRS) to measure brain glucose, metabolites, and neurotransmitters; and NDEVs and cerebrospinal fluid to derive biomarkers for AD/ADRD.

The study included 40 cognitively intact overweight participants with IR, mean age 63.2 years, 60% women, and 62.5% White. Their mean body weight was 97.1 kg and mean body mass index (BMI) was 34.4.

Participants were randomly assigned to 8 weeks of an IF diet or an HL diet that emphasizes fruits, vegetables, whole grains, lean proteins, and low-fat dairy and limits added sugars, saturated fats, and sodium.

The IF diet involved following the HL diet for 5 days per week and restricting calories to a quarter of the recommended daily intake for 2 consecutive days.

Both diets reduced neuronal IR and had comparable effects in improving insulin signaling biomarkers in NDEVs, reducing brain glucose on MRS, and improving blood biomarkers of carbohydrate and lipid metabolism.

Using MRI, researchers also assessed brain age, an indication of whether the brain appears older or younger than an individual’s chronological age. There was a decrease of 2.63 years with the IF diet (P = .05) and 2.42 years with the HL diet (P < .001) in the anterior cingulate and ventromedial prefrontal cortex.

Both diets improved executive function and memory, with those following the IF diet benefiting more in strategic planning, switching between two cognitively demanding tasks, cued recall, and other areas.
 

Hypothesis-Generating Research

AD biomarkers, including amyloid beta 42 (Aβ42), Aβ40, and plasma phosphorylated tau 181, did not change with either diet, a finding that investigators speculated may be due to the short duration of the study. Neurofilament light chain increased across groups, with no differences between the diets.

In other findings, BMI decreased by 1.41 with the IF diet and by 0.80 with the HL diet, and a similar pattern was observed for weight. Waist circumference decreased in both groups with no significant differences between diets.

An exploratory analysis showed executive function improved with the IF diet but not with the HL diet in women, whereas it improved with both diets in men. BMI and apolipoprotein E and SLC16A7 genotypes also modulated diet effects.

Both diets were well tolerated. The most frequent adverse events were gastrointestinal and occurred only with the IF diet.

The authors noted the findings are preliminary and results are hypothesis generating. Study limitations included the study’s short duration and its limited statistical power, which could detect only moderate to large changes and differences between the diets. Researchers also did not collect data on dietary intake, so lapses in adherence cannot be excluded. However, the large decreases in BMI, weight, and waist circumference with both diets indicated high adherence.

The study was supported by the National Institutes of Health’s National Institute on Aging. The authors reported no competing interests.
 

A version of this article first appeared on Medscape.com.


Study Links Newer Shingles Vaccine to Delayed Dementia Diagnosis

Article Type
Changed
Fri, 07/26/2024 - 12:24

 

Receipt of a newer recombinant version of a shingles vaccine is associated with a significant delay in dementia diagnosis in older adults, a new study suggests.

The study builds on previous observations of a reduction in dementia risk with the older live shingles vaccine and reports a delay in dementia diagnosis of 164 days with the newer recombinant version, compared with the live vaccine. 

“Given the prevalence of dementia, a delay of 164 days in diagnosis would not be a trivial effect at the public health level. It’s a big enough effect that if there is a causality it feels meaningful,” said senior author Paul Harrison, DM, FRCPsych, professor of psychiatry at the University of Oxford, Oxford, England. 

But Dr. Harrison stressed that the study had not proven that the shingles vaccine reduced dementia risk. 

“The design of the study allows us to do away with many of the confounding effects we usually see in observational studies, but this is still an observational study, and as such it cannot prove a definite causal effect,” he said. 

The study was published online on July 25 in Nature Medicine.
 

‘Natural Experiment’

Given the risk for deleterious consequences of shingles, vaccination is now recommended for older adults in many countries. The previously used live shingles vaccine (Zostavax) is being replaced in most countries with the new recombinant shingles vaccine (Shingrix), which is more effective at preventing shingles infection. 

The current study made use of a “natural experiment” in the United States, which switched over from use of the live vaccine to the recombinant vaccine in October 2017. 

Researchers used electronic health records to compare the incidence of a dementia diagnosis in individuals who received the live shingles vaccine prior to October 2017 with those who received the recombinant version after the United States made the switch. 

They also used propensity score matching to further control for confounding factors, comparing 103,837 individuals who received a first dose of the live shingles vaccine between October 2014 and September 2017 with the same number of matched people who received the recombinant vaccine between November 2017 and October 2020. 

Results showed that within the 6 years after vaccination, the recombinant vaccine was associated with a delay in the diagnosis of dementia, compared with the live vaccine. Specifically, receiving the recombinant vaccine was associated with a 17% increase in diagnosis-free time, translating to 164 additional days lived without a diagnosis of dementia in those subsequently affected. 

As an additional control, the researchers also found significantly lower risks for dementia in individuals receiving the new recombinant shingles vaccine vs two other vaccines commonly used in older people: influenza and tetanus/diphtheria/pertussis vaccines, with increases in diagnosis-free time of 14%-27%. 

Reduced Risk or Delayed Diagnosis?

Speaking at a Science Media Centre press conference on the study, lead author Maxime Taquet, PhD, FRCPsych, clinical lecturer in psychiatry at the University of Oxford, noted that the total number of dementia cases was similar in the two shingles vaccine groups by the end of the 6-year follow-up period, but there was a difference in the time at which participants received a diagnosis of dementia.

“The study suggests that rather than actually reducing dementia risk, the recombinant vaccine delays the onset of dementia compared to the live vaccine in patients who go on to develop the condition,” he explained. 

But when comparing the recombinant vaccine with the influenza and tetanus/diphtheria/pertussis vaccines there was a clear reduction in dementia risk itself, Dr. Taquet reported. 

“It might well be that the live vaccine has a potential effect on the risk of dementia itself and therefore the recombinant vaccine only shows a delay in dementia compared to the live vaccine, but both of them might decrease the overall risk of dementia,” he suggested. 

But the researchers cautioned that this study could not prove causality. 

“While the two groups were very carefully matched in terms of factors that might influence the development of dementia, we still have to be cautious before assuming that the vaccine is indeed causally reducing the risk of onset of dementia,” Dr. Harrison warned. 

The researchers say the results would need to be confirmed in a randomized trial, which may have to be conducted in a slightly younger age group, as currently shingles vaccine is recommended for all older individuals in the United Kingdom. 

Vaccine recommendations vary from country to country, Dr. Harrison added. In the United States, the Centers for Disease Control and Prevention recommends the recombinant shingles vaccine for all adults aged 50 years or older. 

In the meantime, it would be interesting to see whether further observational studies in other countries find similar results as this US study, Dr. Harrison said.  
 

Mechanism Uncertain

Speculating on a possible mechanism behind the findings, Dr. Harrison suggested two plausible explanations.

“First, it is thought that the herpes virus could be one of many factors that could promote dementia, so a vaccine that stops reactivation of this virus might therefore be delaying that process,” he noted. 

The other possibility is that adjuvants included in the recombinant vaccine to stimulate the immune system might have played a role. 

“We don’t have any data on the mechanism, and this study did not address that, so further studies are needed to look into this,” Dr. Harrison said. 
 

Stronger Effect in Women

Another intriguing finding is that the association between the recombinant vaccine and delayed dementia diagnosis seemed to be stronger in women than in men. 

In the original study of the live shingles vaccine, a protective effect against dementia was shown only in women. 

In the current study, the delay in dementia diagnosis was seen in both sexes but was stronger in women, showing a 22% increased time without dementia in women versus a 13% increased time in men with the recombinant versus the live vaccine. 

As expected, the recombinant vaccine was associated with a lower risk for shingles disease vs the live vaccine (2.5% versus 3.5%), but women did not have a better response than men did in this respect. 

“The better protection against shingles with the recombinant vaccine was similar in men and women, an observation that might be one reason to question the possible mechanism behind the dementia effect being better suppression of the herpes zoster virus by the recombinant vaccine,” Dr. Harrison commented. 

Though these findings are not likely to lead to any immediate changes in policy regarding the shingles vaccine, Dr. Harrison said it would be interesting to see whether uptake of the vaccine increased after this study. 

He estimated that, currently in the United Kingdom, about 60% of older adults choose to have the shingles vaccine. A 2020 study in the United States found that only about one-third of US adults over 60 had received the vaccine. 

“It will be interesting to see if that figure increases after these data are publicized, but I am not recommending that people have the vaccine specifically to lower their risk of dementia because of the caveats about the study that we have discussed,” he commented. 
 

Outside Experts Positive 

Outside experts, providing comment to the Science Media Centre, welcomed the new research. 

“The study is very well-conducted and adds to previous data indicating that vaccination against shingles is associated with lower dementia risk. More research is needed in future to determine why this vaccine is associated with lower dementia risk,” said Tara Spires-Jones, FMedSci, president of the British Neuroscience Association. 

The high number of patients in the study and the adjustments for potential confounders are also strong points, noted Andrew Doig, PhD, professor of biochemistry, University of Manchester, Manchester, England.

“This is a significant result, comparable in effectiveness to the recent antibody drugs for Alzheimer’s disease,” Dr. Doig said. “Administering the recombinant shingles vaccine could well be a simple and cheap way to lower the risk of Alzheimer’s disease.”

Dr. Doig noted that a link between herpes zoster infection and the onset of dementia has been suspected for some time, and a trial of the antiviral drug valacyclovir against Alzheimer’s disease is currently underway.

In regard to the shingles vaccine, he said a placebo-controlled trial would be needed to prove causality. 

“We also need to see how many years the effect might last and whether we should vaccinate people at a younger age. We know that the path to Alzheimer’s can start decades before any symptoms are apparent, so the vaccine might be even more effective if given to people in their 40s or 50s,” he said.

Dr. Harrison and Dr. Taquet reported no disclosures. Dr. Doig is a founder, director, and consultant for PharmaKure, which works on Alzheimer’s drugs and diagnostics. Other commentators declared no disclosures.

A version of this article first appeared on Medscape.com.


The study was published online on July 25 in Nature Medicine.
 

‘Natural Experiment’

Given the risk for deleterious consequences of shingles, vaccination is now recommended for older adults in many countries. The previously used live shingles vaccine (Zostavax) is being replaced in most countries with the new recombinant shingles vaccine (Shingrix), which is more effective at preventing shingles infection. 

The current study made use of a “natural experiment” in the United States, which switched over from use of the live vaccine to the recombinant vaccine in October 2017. 

Researchers used electronic health records to compare the incidence of a dementia diagnosis in individuals who received the live shingles vaccine prior to October 2017 with those who received the recombinant version after the United States made the switch. 

They also used propensity score matching to further control for confounding factors, comparing 103,837 individuals who received a first dose of the live shingles vaccine between October 2014 and September 2017 with the same number of matched people who received the recombinant vaccine between November 2017 and October 2020. 
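Propensity score matching pairs each recipient of one vaccine with a recipient of the other who looks similar on measured covariates. The paper's actual matching procedure is not reproduced here; as a minimal sketch of the core idea (hypothetical scores, arbitrary caliper), a greedy 1:1 nearest-neighbor match on precomputed propensity scores might look like:

```python
# Illustrative sketch only (not the authors' code): greedy 1:1
# nearest-neighbor matching on precomputed propensity scores.

def greedy_match(treated, controls, caliper=0.05):
    """Pair each treated score with the closest unused control score.

    treated, controls: lists of propensity scores (estimated probability
    of receiving the recombinant vaccine given covariates).
    caliper: maximum allowed score difference for a valid match.
    """
    available = list(controls)
    pairs = []
    for t in sorted(treated):
        best = min(available, key=lambda c: abs(c - t), default=None)
        if best is not None and abs(best - t) <= caliper:
            pairs.append((t, best))
            available.remove(best)
    return pairs

# Hypothetical scores for demonstration only.
recombinant = [0.31, 0.47, 0.52]
live = [0.30, 0.33, 0.50, 0.90]
# Two pairs form; 0.52 has no control within the caliper, and 0.33 and
# 0.90 go unused.
print(greedy_match(recombinant, live))
```

In practice the scores come from a model such as logistic regression fit on the covariates, and published analyses use more careful matching algorithms than this single greedy pass; the caliper of 0.05 is an arbitrary choice for the sketch.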

Results showed that within the 6 years after vaccination, the recombinant vaccine was associated with a delay in the diagnosis of dementia, compared with the live vaccine. Specifically, receiving the recombinant vaccine was associated with a 17% increase in diagnosis-free time, translating to 164 additional days lived without a diagnosis of dementia in those subsequently affected. 
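As a rough consistency check (my arithmetic, not a figure reported by the authors), the 17% and 164-day numbers together imply a baseline mean diagnosis-free time:

```python
# Back-of-envelope check: if a 17% relative increase corresponds to 164
# extra days, the implied baseline diagnosis-free time follows directly.
increase_fraction = 0.17   # 17% more time without a dementia diagnosis
extra_days = 164           # additional days reported in the study

implied_baseline_days = extra_days / increase_fraction
print(round(implied_baseline_days))            # about 965 days
print(round(implied_baseline_days / 365, 1))   # about 2.6 years
```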

As an additional control, the researchers also found significantly lower risks for dementia in individuals receiving the new recombinant shingles vaccine vs two other vaccines commonly used in older people: influenza and tetanus/diphtheria/pertussis vaccines, with increases in diagnosis-free time of 14%-27%. 

Reduced Risk or Delayed Diagnosis?

Speaking at a Science Media Centre press conference on the study, lead author Maxime Taquet, PhD, FRCPsych, clinical lecturer in psychiatry at the University of Oxford, noted that the total number of dementia cases was similar in the two shingles vaccine groups by the end of the 6-year follow-up period, but there was a difference in the time at which patients received a diagnosis of dementia.

“The study suggests that rather than actually reducing dementia risk, the recombinant vaccine delays the onset of dementia compared to the live vaccine in patients who go on to develop the condition,” he explained. 

But when comparing the recombinant vaccine with the influenza and tetanus/diphtheria/pertussis vaccines there was a clear reduction in dementia risk itself, Dr. Taquet reported. 

“It might well be that the live vaccine has a potential effect on the risk of dementia itself and therefore the recombinant vaccine only shows a delay in dementia compared to the live vaccine, but both of them might decrease the overall risk of dementia,” he suggested. 

But the researchers cautioned that this study could not prove causality. 

“While the two groups were very carefully matched in terms of factors that might influence the development of dementia, we still have to be cautious before assuming that the vaccine is indeed causally reducing the risk of onset of dementia,” Dr. Harrison warned. 

The researchers say the results would need to be confirmed in a randomized trial, which may have to be conducted in a slightly younger age group, as currently shingles vaccine is recommended for all older individuals in the United Kingdom. 

Vaccine recommendations vary from country to country, Dr. Harrison added. In the United States, the Centers for Disease Control and Prevention recommends the recombinant shingles vaccine for all adults aged 50 years or older. 

In the meantime, it would be interesting to see whether further observational studies in other countries find results similar to those of this US study, Dr. Harrison said. 
 

Mechanism Uncertain

Speculating on a possible mechanism behind the findings, Dr. Harrison suggested two plausible explanations.

“First, it is thought that the herpes virus could be one of many factors that could promote dementia, so a vaccine that stops reactivation of this virus might therefore be delaying that process,” he noted. 

The other possibility is that adjuvants included in the recombinant vaccine to stimulate the immune system might have played a role. 

“We don’t have any data on the mechanism, and this study did not address that, so further studies are needed to look into this,” Dr. Harrison said. 
 

Stronger Effect in Women

Another intriguing finding is that the association between the recombinant vaccine and delayed dementia diagnosis seemed to be stronger in women than in men. 

In the original study of the live shingles vaccine, a protective effect against dementia was shown only in women. 

In the current study, the delay in dementia diagnosis was seen in both sexes but was stronger in women, showing a 22% increased time without dementia in women versus a 13% increased time in men with the recombinant versus the live vaccine. 

As expected, the recombinant vaccine was associated with a lower risk for shingles disease vs the live vaccine (2.5% vs 3.5%), but women did not have a better response than men did in this respect. 

“The better protection against shingles with the recombinant vaccine was similar in men and women, an observation that might be one reason to question the possible mechanism behind the dementia effect being better suppression of the herpes zoster virus by the recombinant vaccine,” Dr. Harrison commented. 

Though these findings are not likely to lead to any immediate changes in policy regarding the shingles vaccine, Dr. Harrison said it would be interesting to see whether uptake of the vaccine increased after this study. 

He estimated that, currently in the United Kingdom, about 60% of older adults choose to have the shingles vaccine. A 2020 study in the United States found that only about one-third of US adults over 60 had received the vaccine. 

“It will be interesting to see if that figure increases after these data are publicized, but I am not recommending that people have the vaccine specifically to lower their risk of dementia because of the caveats about the study that we have discussed,” he commented. 
 

Outside Experts Positive 

Outside experts, providing comment to the Science Media Centre, welcomed the new research. 

“The study is very well-conducted and adds to previous data indicating that vaccination against shingles is associated with lower dementia risk. More research is needed in future to determine why this vaccine is associated with lower dementia risk,” said Tara Spires-Jones, FMedSci, president of the British Neuroscience Association. 

The high number of patients in the study and the adjustments for potential confounders are also strong points, noted Andrew Doig, PhD, professor of biochemistry, University of Manchester, Manchester, England.

“This is a significant result, comparable in effectiveness to the recent antibody drugs for Alzheimer’s disease,” Dr. Doig said. “Administering the recombinant shingles vaccine could well be a simple and cheap way to lower the risk of Alzheimer’s disease.”

Dr. Doig noted that a link between herpes zoster infection and the onset of dementia has been suspected for some time, and a trial of the antiviral drug valacyclovir against Alzheimer’s disease is currently underway.

In regard to the shingles vaccine, he said a placebo-controlled trial would be needed to prove causality. 

“We also need to see how many years the effect might last and whether we should vaccinate people at a younger age. We know that the path to Alzheimer’s can start decades before any symptoms are apparent, so the vaccine might be even more effective if given to people in their 40s or 50s,” he said.

Dr. Harrison and Dr. Taquet reported no disclosures. Dr. Doig is a founder, director, and consultant for PharmaKure, which works on Alzheimer’s drugs and diagnostics. Other commentators declared no disclosures.

A version of this article first appeared on Medscape.com.


FROM NATURE MEDICINE


Heat Waves: A Silent Threat to Older Adults’ Kidneys


 

TOPLINE:

Older adults show an increase in creatinine and cystatin C levels after exposure to extreme heat in a dry setting despite staying hydrated; however, changes in these kidney function biomarkers are much more modest in a humid setting and in young adults.

METHODOLOGY:

  • Older adults are vulnerable to heat-related morbidity and mortality, with kidney complications accounting for many excess hospital admissions during heat waves.
  • Researchers investigated plasma-based markers of kidney function following extreme heat exposure for 3 hours in 20 young (21-39 years) and 18 older (65-76 years) adults recruited from the Dallas-Fort Worth area.
  • All participants underwent heat exposure in a chamber at 47 °C (116 °F) and 15% relative humidity (dry setting) and 41 °C (105 °F) and 40% relative humidity (humid setting) on separate days. They performed light physical activity mimicking their daily tasks and drank 3 mL/kg body mass of water every hour while exposed to heat.
  • Blood samples were collected at baseline, immediately before the end of heat exposure (end-heating), and 2 hours after heat exposure.
  • Plasma creatinine was the primary outcome, with a change ≥ 0.3 mg/dL considered clinically meaningful. Cystatin C was the secondary outcome.

TAKEAWAY:

  • The plasma creatinine level showed a modest increase from baseline to end-heating (difference, 0.10 mg/dL; P = .004) and at 2 hours post exposure (difference, 0.17 mg/dL; P < .001) in older adults facing heat exposure in the dry setting.
  • The mean cystatin C levels also increased from baseline to end-heating by 0.29 mg/L (P = .01) and at 2 hours post heat exposure by 0.28 mg/L (P = .004) in older adults in the dry setting.
  • The mean creatinine levels increased by only 0.06 mg/dL (P = .01) from baseline to 2 hours post exposure in older adults facing heat exposure in the humid setting.
  • Young adults didn’t show any significant change in the plasma cystatin C levels during or after heat exposure; however, there was a modest increase in the plasma creatinine levels 2 hours after heat exposure (difference, 0.06 mg/dL; P = .004).
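The bullet points above can be summarized with a quick check (a sketch, not the authors' analysis) of each reported creatinine change against the study's 0.3 mg/dL threshold:

```python
# Compare reported plasma creatinine changes with the study's prespecified
# threshold for a clinically meaningful change (>= 0.3 mg/dL).
CLINICAL_THRESHOLD_MG_DL = 0.3

def clinically_meaningful(delta_mg_dl):
    return delta_mg_dl >= CLINICAL_THRESHOLD_MG_DL

reported_changes = {
    "older adults, dry, end-heating": 0.10,
    "older adults, dry, 2 h post":    0.17,
    "older adults, humid, 2 h post":  0.06,
    "young adults, 2 h post":         0.06,
}
for label, delta in reported_changes.items():
    # All reported changes fall below the 0.3 mg/dL threshold.
    print(label, clinically_meaningful(delta))
```

None of the statistically significant changes reaches the clinical threshold, which fits the authors' description of the evidence as limited.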

IN PRACTICE:

“These findings provide limited evidence that the heightened thermal strain in older adults during extreme heat may contribute to reduced kidney function,” the authors wrote. 

SOURCE:

The study was led by Zachary J. McKenna, PhD, from the Department of Internal Medicine, University of Texas Southwestern Medical Center, Dallas, Texas, and was published online in JAMA.

LIMITATIONS:

The use of plasma-based markers of kidney function, a short laboratory-based exposure, and a small number of generally healthy participants were the main limitations that could affect the generalizability of this study’s findings to broader populations and real-world settings. 

DISCLOSURES:

The National Institutes of Health and American Heart Association funded this study. Two authors declared receiving grants and nonfinancial support from several sources. 

This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article appeared on Medscape.com.


How the New Vitamin D Guidelines Will, and Won’t, Change My Practice


Hi, everyone. I’m Dr. Kenny Lin. I am a family physician and associate director of the Lancaster General Hospital Family Medicine Residency, and I blog at Common Sense Family Doctor.

A few months ago, my health system added a clinical decision support function to our electronic health record to reduce inappropriate ordering of vitamin D levels. Clinicians are now required to select from a list of approved indications or diagnoses (including a history of vitamin D deficiency) before ordering the test.

Although I don’t know yet whether this process has had the desired effect, I felt that it was long overdue. Several years ago, I wrote an editorial that questioned the dramatic increase in vitamin D testing given the uncertainty about what level is adequate for good health and clinical trials showing that supplementing people with lower levels has no benefits for a variety of medical conditions. A more recent review of prospective studies of vitamin D supplements concluded that most correlations between vitamin D levels and outcomes in common and high-mortality conditions are unlikely to be causal.

A new Endocrine Society guideline recommends against routine measurement of vitamin D levels in healthy individuals. The guideline reinforces my current practice of not screening for vitamin D deficiency except in special situations, such as an individual with dark skin who works the night shift and rarely goes outdoors during daytime hours. But I haven’t been offering empirical vitamin D supplements to the four at-risk groups identified by the Endocrine Society: children, adults older than 75 years, pregnant patients, and adults with prediabetes. The evidence behind these recommendations merits a closer look.

In exclusively or primarily breastfed infants, I follow the American Academy of Pediatrics recommendation to prescribe a daily supplement containing 400 IU of vitamin D. However, the Endocrine Society found evidence from several studies conducted in other countries that continuing supplementation throughout childhood reduces the risk for rickets and possibly reduces the incidence of respiratory infections, with few adverse effects.

Many older women, and some older men, choose to take a calcium and vitamin D supplement for bone health, even though there is scant evidence that doing so prevents fractures in community-dwelling adults without osteoporosis. The Endocrine Society’s meta-analysis, however, found that 1000 adults aged 75 years or older who took an average of 900 IU of vitamin D daily for 2 years could expect to experience six fewer deaths than an identical group not taking supplements.

A typical prenatal vitamin contains 400 IU of vitamin D. Placebo-controlled trials reviewed by the Endocrine Society that gave an average of 2500 IU daily found statistically nonsignificant reductions in preeclampsia, intrauterine death, preterm birth, small-for-gestational-age birth, and neonatal deaths.

Finally, the Endocrine Society’s recommendation for adults with prediabetes was based on 11 trials (three conducted in the United States) that tested a daily average of 3500 IU and found a slightly lower risk for progression to diabetes (24 fewer diagnoses of type 2 diabetes per 1000 persons) in the group who took supplements.
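The absolute risk differences in the two paragraphs above can be restated as numbers needed to treat; this is back-of-envelope arithmetic on the quoted figures, not a calculation from the guideline itself:

```python
# Number needed to treat (NNT) from an absolute risk reduction expressed
# as events prevented per 1000 people treated.
def nnt(events_prevented_per_1000):
    return round(1000 / events_prevented_per_1000)

print(nnt(6))    # ~167 adults >= 75 supplemented ~2 years per death averted
print(nnt(24))   # ~42 adults with prediabetes per diabetes diagnosis averted
```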

Of the four groups highlighted by the guideline, the strongest case for vitamin D supplements is in older adults — it’s hard to argue with lower mortality, even if the difference is small. Therefore, I will start suggesting that my patients over age 75 take a daily vitamin D supplement containing at least 800 IU if they aren’t already doing so.

On the other hand, I don’t plan to change my approach to pregnant patients (whose benefits in studies could have been due to chance), children after age 1 year (studies of children in other countries with different nutritional status may not apply to the United States), or adults with prediabetes (where we already have proven lifestyle interventions with much greater effects). In these cases, either I am unconvinced that the data support benefits for my patients, or I feel that the benefits of vitamin D supplements are small enough to be outweighed by potential harms, such as increased kidney stones.

Kenneth W. Lin, Associate Director, Family Medicine Residency Program, Lancaster General Hospital, Lancaster, Pennsylvania, has disclosed no relevant financial relationships.

A version of this article first appeared on Medscape.com.

Publications
Topics
Sections

Hi, everyone. I’m Dr. Kenny Lin. I am a family physician and associate director of the Lancaster General Hospital Family Medicine Residency, and I blog at Common Sense Family Doctor.

A few months ago, my health system added a clinical decision support function to our electronic health record to reduce inappropriate ordering of vitamin D levels. Clinicians are now required to select from a list of approved indications or diagnoses (including a history of vitamin D deficiency) before ordering the test.

Although I don’t know yet whether this process has had the desired effect, I felt that it was long overdue. Several years ago, I wrote an editorial that questioned the dramatic increase in vitamin D testing given the uncertainty about what level is adequate for good health and clinical trials showing that supplementing people with lower levels has no benefits for a variety of medical conditions. A more recent review of prospective studies of vitamin D supplements concluded that most correlations between vitamin D levels and outcomes in common and high-mortality conditions are unlikely to be causal.

A new Endocrine Society guideline recommends against routine measurement of vitamin D levels in healthy individuals. The guideline reinforces my current practice of not screening for vitamin D deficiency except in special situations, such as an individual with dark skin who works the night shift and rarely goes outdoors during daytime hours. But I haven’t been offering empirical vitamin D supplements to the four at-risk groups identified by the Endocrine Society: children, adults older than 75 years, pregnant patients, and adults with prediabetes. The evidence behind these recommendations merits a closer look.

In exclusively or primarily breastfed infants, I follow the American Academy of Pediatrics recommendation to prescribe a daily supplement containing 400 IU of vitamin D. However, the Endocrine Society found evidence from several studies conducted in other countries that continuing supplementation throughout childhood reduces the risk for rickets and possibly reduces the incidence of respiratory infections, with few adverse effects.

Many older women, and some older men, choose to take a calcium and vitamin D supplement for bone health, even though there is scant evidence that doing so prevents fractures in community-dwelling adults without osteoporosis. The Endocrine Society’s meta-analysis, however, found that 1000 adults aged 75 years or older who took an average of 900 IU of vitamin D daily for 2 years could expect to experience six fewer deaths than an identical group not taking supplements.

A typical prenatal vitamin contains 400 IU of vitamin D. Placebo-controlled trials reviewed by the Endocrine Society that gave an average of 2500 IU daily found statistically insignificant reductions in preeclampsia, intrauterine death, preterm birth, small for gestation age birth, and neonatal deaths.

Finally, the Endocrine Society’s recommendation for adults with prediabetes was based on 11 trials (three conducted in the United States) that tested a daily average of 3500 IU and found a slightly lower risk for progression to diabetes (24 fewer diagnoses of type 2 diabetes per 1000 persons) in the group who took supplements.

Of the four groups highlighted by the guideline, the strongest case for vitamin D supplements is in older adults — it’s hard to argue with lower mortality, even if the difference is small. Therefore, I will start suggesting that my patients over age 75 take a daily vitamin D supplement containing at least 800 IU if they aren’t already doing so.

On the other hand, I don’t plan to change my approach to pregnant patients (whose benefits in studies could have been due to chance), children after age 1 year (studies of children in other countries with different nutritional status may not apply to the United States), or adults with prediabetes (where we already have proven lifestyle interventions with much greater effects). In these cases, either I am unconvinced that the data support benefits for my patients, or I feel that the benefits of vitamin D supplements are small enough to be outweighed by potential harms, such as increased kidney stones.

Kenneth W. Lin, Associate Director, Family Medicine Residency Program, Lancaster General Hospital, Lancaster, Pennsylvania, has disclosed no relevant financial relationships.

A version of this article first appeared on Medscape.com.

Hi, everyone. I’m Dr. Kenny Lin. I am a family physician and associate director of the Lancaster General Hospital Family Medicine Residency, and I blog at Common Sense Family Doctor.

A few months ago, my health system added a clinical decision support function to our electronic health record to reduce inappropriate ordering of vitamin D levels. Clinicians are now required to select from a list of approved indications or diagnoses (including a history of vitamin D deficiency) before ordering the test.

Although I don’t know yet whether this process has had the desired effect, I felt that it was long overdue. Several years ago, I wrote an editorial that questioned the dramatic increase in vitamin D testing given the uncertainty about what level is adequate for good health and clinical trials showing that supplementing people with lower levels has no benefits for a variety of medical conditions. A more recent review of prospective studies of vitamin D supplements concluded that most correlations between vitamin D levels and outcomes in common and high-mortality conditions are unlikely to be causal.

A new Endocrine Society guideline recommends against routine measurement of vitamin D levels in healthy individuals. The guideline reinforces my current practice of not screening for vitamin D deficiency except in special situations, such as an individual with dark skin who works the night shift and rarely goes outdoors during daytime hours. But I haven’t been offering empirical vitamin D supplements to the four at-risk groups identified by the Endocrine Society: children, adults older than 75 years, pregnant patients, and adults with prediabetes. The evidence behind these recommendations merits a closer look.

In exclusively or primarily breastfed infants, I follow the American Academy of Pediatrics recommendation to prescribe a daily supplement containing 400 IU of vitamin D. However, the Endocrine Society found evidence from several studies conducted in other countries that continuing supplementation throughout childhood reduces the risk for rickets and possibly reduces the incidence of respiratory infections, with few adverse effects.

Many older women, and some older men, choose to take a calcium and vitamin D supplement for bone health, even though there is scant evidence that doing so prevents fractures in community-dwelling adults without osteoporosis. The Endocrine Society’s meta-analysis, however, found that 1000 adults aged 75 years or older who took an average of 900 IU of vitamin D daily for 2 years could expect to experience six fewer deaths than an identical group not taking supplements.

A typical prenatal vitamin contains 400 IU of vitamin D. Placebo-controlled trials reviewed by the Endocrine Society that gave an average of 2500 IU daily found statistically nonsignificant reductions in preeclampsia, intrauterine death, preterm birth, small-for-gestational-age birth, and neonatal deaths.

Finally, the Endocrine Society’s recommendation for adults with prediabetes was based on 11 trials (three conducted in the United States) that tested a daily average of 3500 IU and found a slightly lower risk for progression to diabetes (24 fewer diagnoses of type 2 diabetes per 1000 persons) in the group who took supplements.
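The absolute risk reductions quoted above can be translated into numbers needed to treat, which may make the trade-offs easier to weigh. A minimal sketch, using only the per-1000 figures stated in the text (the `nnt` helper is illustrative, not from any guideline):

```python
def nnt(events_prevented_per_1000: float) -> float:
    """Number needed to treat = 1 / absolute risk reduction."""
    arr = events_prevented_per_1000 / 1000  # absolute risk reduction
    return 1 / arr

# Adults >= 75 years: 6 fewer deaths per 1000 over ~2 years of supplementation
print(round(nnt(6)))   # ~167 people treated to prevent 1 death

# Adults with prediabetes: 24 fewer type 2 diabetes diagnoses per 1000
print(round(nnt(24)))  # ~42 people treated to prevent 1 diagnosis
```

Seen this way, the mortality benefit in older adults is real but modest, which is consistent with the "small difference" framing below.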

Of the four groups highlighted by the guideline, the strongest case for vitamin D supplements is in older adults — it’s hard to argue with lower mortality, even if the difference is small. Therefore, I will start suggesting that my patients over age 75 take a daily vitamin D supplement containing at least 800 IU if they aren’t already doing so.

On the other hand, I don’t plan to change my approach to pregnant patients (whose benefits in studies could have been due to chance), children after age 1 year (studies of children in other countries with different nutritional status may not apply to the United States), or adults with prediabetes (where we already have proven lifestyle interventions with much greater effects). In these cases, either I am unconvinced that the data support benefits for my patients, or I feel that the benefits of vitamin D supplements are small enough to be outweighed by potential harms, such as increased kidney stones.

Kenneth W. Lin, Associate Director, Family Medicine Residency Program, Lancaster General Hospital, Lancaster, Pennsylvania, has disclosed no relevant financial relationships.

A version of this article first appeared on Medscape.com.


New Criteria Distinguish Memory Disorder Often Misdiagnosed as Alzheimer’s

Article Type
Changed
Thu, 07/25/2024 - 15:04

Proposed clinical criteria for a memory loss disorder that is often misdiagnosed as Alzheimer’s disease (AD) have been published.

The new criteria for limbic-predominant amnestic neurodegenerative syndrome (LANS) provide a framework for neurologists and other experts to classify the condition and offer a more precise diagnosis and potential treatments.

“In our clinical work, we see patients whose memory symptoms appear to mimic Alzheimer’s disease, but when you look at their brain imaging or biomarkers, it’s clear they don’t have Alzheimer’s. Until now, there has not been a specific medical diagnosis to point to, but now we can offer them some answers,” senior investigator David T. Jones, MD, said in a release.

The proposed criteria and the research behind them were published online in Brain Communications and will be presented at the Alzheimer’s Association International Conference in Philadelphia.
 

Already in Use

Predominant limbic degeneration has been linked to various underlying etiologies, older age, predominant impairment of episodic memory, and slow clinical progression, the investigators noted. However, they added, the neurologic syndrome associated with predominant limbic degeneration is undefined.

Developing clinical criteria and validating them “is critical to distinguish such a syndrome from those originating from neocortical degeneration, which may differ in underlying etiology, disease course, and therapeutic needs,” the investigators wrote.

The newly proposed clinical criteria apply to LANS, which is “highly associated with limbic-predominant age-related TDP-43 encephalopathy but also other pathologic entities.”

The criteria incorporate core, standard, and advanced features (older age at evaluation, mild clinical syndrome, disproportionate hippocampal atrophy, impaired semantic memory, limbic hypometabolism, absence of endocortical degeneration, and low likelihood of neocortical tau), yielding highest, high, moderate, and low degrees of certainty.

“A detailed history of the clinical symptoms, which may be supported by neuropsychological testing, with the observation of disproportionate hippocampal atrophy and limbic degeneration on MRI/FDG yields a high confidence in a diagnosis of LANS, where the most likely symptom-driving proteinopathy is TDP-43 and not Alzheimer’s associated proteins,” the first author, Nick Corriveau-Lecavalier, PhD, assistant professor of neurology and psychology at Mayo Clinic, Rochester, Minnesota, told this news organization.

To validate the criteria, the investigators screened autopsied patients from Mayo Clinic and Alzheimer’s Disease Neuroimaging Initiative cohorts and applied the criteria to those with a predominant amnestic syndrome and those who had AD neuropathologic change, limbic-predominant age-related TDP-43 encephalopathy, or both pathologies at autopsy.

“The criteria effectively categorized these cases, with Alzheimer’s disease having the lowest likelihoods, limbic-predominant age-related TDP-43 encephalopathy patients having the highest likelihoods, and patients with both pathologies having intermediate likelihoods,” the investigators reported.

“Patients with high likelihoods had a milder and slower clinical course and more severe temporo-limbic degeneration compared to those with low likelihoods,” they added.

Dr. Corriveau-Lecavalier said the team is currently analyzing longitudinal cognitive and imaging trajectories in LANS over several years. “This will help us better understand how LANS and Alzheimer’s differ in their sequence of symptoms over time.”

It is important to understand that memory symptoms in old age are not “unequivocally” driven by Alzheimer’s and that LANS progresses more slowly and has a better prognosis than AD, he noted.

In addition, in vivo markers of TDP-43 are “on the horizon and can hopefully make their way to human research settings soon. This will help better understand the underlying molecular etiologies causing LANS and associated symptoms,” he said.

Dr. Corriveau-Lecavalier said the LANS criteria are ready for clinical use by experts in neurologic care. These criteria can be used to inform not only diagnosis but also prognosis, where this syndrome is associated with slow and mild progression and a memory-dominant profile.

He added that “the new criteria are also routinely used in our practice to make decisions about anti-amyloid treatment eligibility.”

Commenting on the research for this news organization, Rebecca M. Edelmayer, PhD, Alzheimer’s Association senior director of scientific engagement, said the research “exemplifies the great need to develop objective criteria for diagnosis and staging of Alzheimer’s and all other types of dementia and to create an integrated biological and clinical staging scheme that can be used effectively by physicians.”

“Advances in biomarkers will help to differentiate all types of dementia when incorporated into the diagnostic workup, but until those tools are available, a more succinct clinical criteria for diagnosis can be used to support a more personalized medicine approach to treatment, care, and enrollment into clinical studies,” said Dr. Edelmayer, who wasn’t involved in the research.

The research was funded in part by the National Institutes of Health and by the Robert Wood Johnson Foundation, the Elsie & Marvin Dekelboum Family Foundation, the Liston Family Foundation, the Edson Family, the Gerald A. and Henrietta Rauenhorst Foundation, and the Foundation Dr Corinne Schuler. Dr. Corriveau-Lecavalier and Dr. Edelmayer had no relevant conflicts of interest.

A version of this article first appeared on Medscape.com.


Prognostication in Hospice Care: Challenges, Opportunities, and the Importance of Functional Status

Article Type
Changed
Thu, 08/01/2024 - 09:44

Predicting life expectancy and providing an end-of-life diagnosis in hospice and palliative care is a challenge for most clinicians. Lack of training, limited communication skills, and relationships with patients are all contributing factors. These skills can improve with the use of functional scoring tools in conjunction with the patient’s comorbidities and physical/psychological symptoms. The Palliative Performance Scale (PPS), Karnofsky Performance Scale (KPS), and Eastern Cooperative Oncology Group Performance Status Scale (ECOG) are commonly used functional scoring tools.

 

The PPS measures 5 functional dimensions including ambulation, activity level, ability to administer self-care, oral intake, and level of consciousness.1 It has been shown to be valid for a broad range of palliative care patients, including those with advanced cancer or life-threatening noncancer diagnoses in hospitals or hospice care.2 The scale, measured in 10% increments, runs from 100% (completely functional) to 0% (dead). A PPS ≤ 70% helps meet hospice eligibility criteria.

The KPS evaluates functional impairment and helps with prognostication. Developed in 1948 to evaluate a patient’s functional ability to tolerate chemotherapy, specifically in lung cancer, it has since been validated to predict mortality across older adults and in chronic disease populations.3,4 The KPS is also measured in 10% increments ranging from 100% (completely functional without assistance) to 0% (dead). A KPS ≤ 70% assists with hospice eligibility criteria (Table 1).5

Developed in 1974, the ECOG has been identified as one of the most important functional status tools in adult cancer care.6 It describes a cancer patient’s ability to care for themselves and participate in daily activities.7 The ECOG is a 6-point scale with scores ranging from 0 (fully active) to 5 (dead). An ECOG score of 4 (sometimes 3) is generally supportive of meeting hospice eligibility (Table 2).6
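The three cutoffs described above point in the same direction but run on different scales. As a rough illustration (not a clinical tool), they can be collected in one place; the function name and the `strict` flag for the ECOG cutoff are hypothetical, and the thresholds are taken from the text:

```python
def supports_hospice_eligibility(scale: str, score: int, strict: bool = True) -> bool:
    """Return True if a score on the named scale helps support hospice eligibility.

    PPS and KPS run from 100 (fully functional) to 0 (dead) in 10% steps;
    ECOG runs from 0 (fully active) to 5 (dead).
    """
    scale = scale.upper()
    if scale in ("PPS", "KPS"):
        return score <= 70
    if scale == "ECOG":
        # An ECOG of 4 is generally supportive; 3 is sometimes accepted.
        return score >= (4 if strict else 3)
    raise ValueError(f"unknown scale: {scale}")

print(supports_hospice_eligibility("KPS", 50))            # True
print(supports_hospice_eligibility("ECOG", 3))            # False under the strict cutoff
print(supports_hospice_eligibility("ECOG", 3, strict=False))  # True
```

Note that these thresholds support, rather than determine, eligibility; the case below shows scores moving in both directions over a hospice stay.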

 

 

CASE Presentation

An 80-year-old patient was admitted to the hospice service at the Veterans Affairs Puget Sound Health Care System (VAPSHCS) community living center (CLC) in Tacoma, Washington, from a community-based acute care hospital. His medical history included prostate cancer with metastasis to his pelvis and type 2 diabetes mellitus, which was stable with treatment with oral medication. Six weeks earlier the patient reported a severe frontal headache that was not responding to over-the-counter analgesics. After 2 days with these symptoms, including a ground-level fall without injuries, he presented to the VAPSHCS emergency department (ED) where a complete neurological examination, including magnetic resonance imaging, revealed a left frontoparietal brain lesion that was 4.2 cm × 3.4 cm × 4.2 cm.

The patient experienced a seizure during his ED evaluation and was admitted for treatment. He underwent a craniotomy in which most, but not all, of the lesion was successfully removed. Postoperatively, the patient exhibited right-sided neglect, gait instability, emotional lability, and cognitive communication disorder. The patient completed 15 of 20 planned radiation treatments but declined further radiation or chemotherapy. The patient decided to halt radiation treatments after being informed by the oncology service that the treatments would likely add only 1 to 2 months to his overall survival, which was < 6 months. The patient elected to focus his goals of care on comfort, dignity, and respect at the end of life and accepted recommendations to be placed into end-of-life hospice care. He was then transferred to the VAPSHCS CLC in Tacoma, Washington, for hospice care.

Upon admission, the patient weighed 94 kg, his vital signs were within reference range, and he reported no pain or headaches. His initial laboratory results revealed a 13.2 g/dL hemoglobin, 3.6 g/dL serum albumin, and a 5.5% hemoglobin A1c, all of which fall into a normal reference range. He had a reported ECOG score of 3 and a KPS score of 50% by the transferring medical team. The patient’s medications included scheduled dexamethasone, metformin, senna, levetiracetam, and as-needed midazolam nasal spray for breakthrough seizures. He also had as-needed acetaminophen for pain. He was alert, oriented ×3, and fully ambulatory but continuously used a 4-wheeled walker for safety and gait instability.

After the patient’s first night, the hospice team met with him to discuss his understanding of his health issues. The patient appeared to have low health literacy but told the team, “I know I am dying.” He had completed written advance directives and a Portable Order for Life-Sustaining Treatment indicating that life-sustaining treatments, including cardiopulmonary resuscitation, supplemental mechanical feeding, or intubation, were not to be used to keep him alive.

At his first 90-day recertification, the patient had gained 8 kg and laboratory results revealed a 14.6 g/dL hemoglobin, 3.8 g/dL serum albumin, and a 6.1% hemoglobin A1c. His ECOG score remained at 3, but his KPS score had increased to 60%. The patient exhibited no new neurologic symptoms or seizures and reported no headaches but had 2 ground-level falls without injury. On both occasions the patient chose not to use his walker to go to the bathroom because it was “too far from my bed.” Per VA policy, after discussions with the hospice team, he was recertified for 90 more days of hospice care. At the end of 6 months in CLC, the patient’s weight remained stable, as did his complete blood count and comprehensive medical panel. He had 1 additional noninjurious ground-level fall and again reported no pain and no use of as-needed acetaminophen. His only medical complication was testing positive for COVID-19, but he remained asymptomatic. The patient was graduated from hospice care and referred to a nearby non-VA adult family home in the community after 180 days. At that time his ECOG score was 2 and his KPS score had increased to 70%.

 

 

DISCUSSION

Primary brain tumors account for about 2% of all malignant neoplasms in adults, and about half of these are gliomas. Glioblastoma multiforme, derived from neuroepithelial cells, is the most frequent and deadly primary malignant central nervous system tumor in adults.8 About 50% of patients with glioblastomas are aged ≥ 65 years at diagnosis.9 A retrospective study of Centers for Medicare and Medicaid Services claims data paired with the Surveillance, Epidemiology, and End Results database indicated a median survival of 4 months for patients with glioblastoma multiforme aged > 65 years, including all treatment modalities.10 Surgical resection combined with radiation and chemotherapy offers the best prognosis for the preservation of neurologic function.11 However, comorbidities, adverse drug effects, and the potential for postoperative complications pose significant risks, especially for older patients. Ultimately, goals of care conversations and advance directives play a very important role in evaluating benefits vs risks with this malignancy.

Our patient was aged 80 years and had previously been diagnosed with metastatic prostate malignancy. His goals of care focused on spending time with his friends, leaving his room to eat in the facility dining area, and continuing his daily walks. He remained clear that he did not want his care team to institute life-sustaining treatments to be kept alive and felt the information regarding the risks vs benefits of accepting chemotherapy was not aligned with his goals of care. Over the 6 months that he received hospice care, he gained weight, improved his hemoglobin and serum albumin levels, and ambulated with the use of a 4-wheeled walker. As the patient exhibited no functional decline or new comorbidities and his functional status improved, the clinical staff felt he no longer needed hospice services. The patient had an ECOG score of 2 and a KPS score of 70% at his hospice graduation.

Medical prognostication is one of the biggest challenges clinicians face. Clinicians are generally “over prognosticators,” and their thoughts tend to be based on the patient relationship, overall experiences in health care, and desire to treat and cure patients.12 In hospice we are asked to define the usual, normal, or expected course of a disease, but what does that mean? Although metastatic malignancies usually have a predictable course in comparison to diagnoses such as dementia, chronic obstructive pulmonary disease, or congestive heart failure, the challenges to improve prognostic ability and predict disease course continue.13-15 Focusing on functional status, goals of care, and comorbidities is key to helping with prognosis. Given the challenge, we find the PPS, KPS, and ECOG scales important tools.

When prognosticating, we attempt to define quantity and quality of life (which our patients must define independently or from the voice of their surrogate) and their ability to perform daily activities. Quality of life in patients with glioblastoma is progressively and significantly impacted due to the emergence of debilitating neurologic symptoms arising from infiltrative tumor growth into functionally intact brain tissue that restricts and disrupts normal day-to-day activities. However, functional status plays a significant role in helping the hospice team improve its overall prognosis.

 

Conclusions

This case study illustrates the difficulty of prognostication even in a patient with severely morbid disease, a history of metastatic prostate cancer, and advanced age. Although a diagnosis may be concerning, documenting a patient’s status using functional scales prior to hospice admission and during the recertification process is helpful in prognostication. Doing so gives health care professionals an accepted medical standard to use regardless of how distinct the patient’s diagnosis. The expression “the disease does not read the textbook” may serve as a helpful reminder in talking with patients and their families. This is important because most patients’ clinical courses differ, and using performance status scales may help improve prognostic skills.

References

1. Cleary TA. The Palliative Performance Scale (PPSv2) Version 2. In: Downing GM, ed. Medical Care of the Dying. 4th ed. Victoria Hospice Society, Learning Centre for Palliative Care; 2006:120.

2. Palliative Performance Scale. ePrognosis, University of California San Francisco. Accessed June 14, 2024. https://eprognosis.ucsf.edu/pps.php

3. Karnofsky DA, Burchenal JH. The Clinical Evaluation of Chemotherapeutic Agents in Cancer. In: MacLeod CM, ed. Evaluation of Chemotherapeutic Agents. Columbia University Press; 1949:191-205.

4. Khalid MA, Achakzai IK, Ahmed Khan S, et al. The use of Karnofsky Performance Status (KPS) as a predictor of 3 month post discharge mortality in cirrhotic patients. Gastroenterol Hepatol Bed Bench. 2018;11(4):301-305.

5. Karnofsky Performance Scale. US Dept of Veterans Affairs. Accessed June 14, 2024. https://www.hiv.va.gov/provider/tools/karnofsky-performance-scale.asp

6. Mischel A-M, Rosielle DA. Eastern Cooperative Oncology Group Performance Status. Palliative Care Network of Wisconsin. December 10, 2021. Accessed June 14, 2024. https://www.mypcnow.org/fast-fact/eastern-cooperative-oncology-group-performance-status/

7. Oken MM, Creech RH, Tormey DC, et al. Toxicity and response criteria of the Eastern Cooperative Oncology Group. Am J Clin Oncol. 1982;5(6):649-655.

8. Nizamutdinov D, Stock EM, Dandashi JA, et al. Prognostication of survival outcomes in patients diagnosed with glioblastoma. World Neurosurg. 2018;109:e67-e74. doi:10.1016/j.wneu.2017.09.104

9. Kita D, Ciernik IFVaccarella S, et al. Age as a predictive factor in glioblastomas: population-based study. Neuroepidemiology. 2009;33(1):17-22. doi:10.1159/000210017

10. Jordan JT, Gerstner ER, Batchelor TT, Cahill DP, Plotkin SR. Glioblastoma care in the elderly. Cancer. 2016;122(2):189-197. doi:10.1002/cnr.29742

11. Brown, NF, Ottaviani D, Tazare J, et al. Survival outcomes and prognostic factors in glioblastoma. Cancers (Basel). 2022;14(13):3161. doi:10.3390/cancers14133161

12. Christalakis NA. Death Foretold: Prophecy and Prognosis in Medical Care. University of Chicago Press; 2000.

13. Weissman DE. Determining Prognosis in Advanced Cancer. Palliative Care Network of Wisconsin. January 28, 2019. Accessed June 14, 2014. https://www.mypcnow.org/fast-fact/determining-prognosis-in-advanced-cancer/

14. Childers JW, Arnold R, Curtis JR. Prognosis in End-Stage COPD. Palliative Care Network of Wisconsin. February 11, 2019. Accessed June 14, 2024. https://www.mypcnow.org/fast-fact/prognosis-in-end-stage-copd/

15. Reisfield GM, Wilson GR. Prognostication in Heart Failure. Palliative Care Network of Wisconsin. February 11, 2019. Accessed June 14, 2024. https://www.mypcnow.org/fast-fact/prognostication-in-heart-failure/

Article PDF
Author and Disclosure Information

David B. Brecher, MDa; Heather J. Sabol, MSN, ARNPa

Correspondence:  David Brecher  (david.brecher@va.gov)

aVeterans Affairs Puget Sound Health Care System, Tacoma, Washington

Author disclosures

The authors report no actual or potential conflicts of interest or outside sources of funding with regard to this article.

Disclaimer

The opinions expressed herein are those of the authors and do not necessarily reflect those of Federal Practitioner, Frontline Medical Communications Inc., the US Government, or any of its agencies.

Ethics and consent

Written informed consent was obtained from the patient and patient identifiers were removed to protect the patient’s identity.

Issue
Federal Practitioner - 41(8)s
Page Number
S50-S53

Predicting life expectancy and providing an end-of-life diagnosis in hospice and palliative care is a challenge for most clinicians. Lack of training, limited communication skills, and relationships with patients are all contributing factors. These skills can improve with the use of functional scoring tools in conjunction with the patient’s comorbidities and physical/psychological symptoms. The Palliative Performance Scale (PPS), Karnofsky Performance Scale (KPS), and Eastern Cooperative Oncology Group Performance Status Scale (ECOG) are commonly used functional scoring tools.

The PPS measures 5 functional dimensions: ambulation, activity level, ability to perform self-care, oral intake, and level of consciousness.1 It has been shown to be valid for a broad range of palliative care patients, including those with advanced cancer or life-threatening noncancer diagnoses in hospital or hospice care.2 The scale, measured in 10% increments, runs from 100% (completely functional) to 0% (dead). A PPS ≤ 70% helps meet hospice eligibility criteria.

The KPS evaluates functional impairment and helps with prognostication. Developed in 1948 to evaluate a patient’s functional ability to tolerate chemotherapy, specifically in lung cancer, it has since been validated to predict mortality in older adults and in chronic disease populations.3,4 The KPS is also measured in 10% increments, ranging from 100% (completely functional without assistance) to 0% (dead). A KPS ≤ 70% assists with hospice eligibility criteria (Table 1).5

Developed in 1974, the ECOG has been identified as one of the most important functional status tools in adult cancer care.6 It describes a cancer patient’s functional ability, evaluating the capacity for self-care and participation in daily activities.7 The ECOG is a 6-point scale, with scores ranging from 0 (fully active) to 5 (dead). An ECOG score of 4 (sometimes 3) is generally supportive of meeting hospice eligibility (Table 2).6
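
The three eligibility cutoffs described above can be summarized in a short sketch. This is illustrative only: the function names are hypothetical, and scores alone never determine hospice eligibility, which also rests on diagnosis, comorbidities, and clinical judgment, as discussed below.

```python
# Illustrative encoding of the hospice-eligibility cutoffs described above.
# Function names are hypothetical; this is not a validated clinical tool.

def pps_supports_eligibility(pps_percent: int) -> bool:
    """PPS runs 100% (completely functional) to 0% (dead) in 10% steps."""
    return pps_percent <= 70

def kps_supports_eligibility(kps_percent: int) -> bool:
    """KPS runs 100% (completely functional without assistance) to 0% (dead)."""
    return kps_percent <= 70

def ecog_supports_eligibility(ecog_score: int, lenient: bool = False) -> bool:
    """ECOG runs 0 (fully active) to 5 (dead); 4 (sometimes 3) supports eligibility."""
    return ecog_score >= (3 if lenient else 4)

# The case patient on hospice admission had an ECOG score of 3 and a KPS of 50%:
print(kps_supports_eligibility(50))                 # True
print(ecog_supports_eligibility(3))                 # False
print(ecog_supports_eligibility(3, lenient=True))   # True
```

The ECOG example shows why the scales can point in different directions for the same patient, which is one reason serial documentation across recertifications matters.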

Case Presentation

An 80-year-old patient was admitted to the hospice service at the Veterans Affairs Puget Sound Health Care System (VAPSHCS) community living center (CLC) in Tacoma, Washington, from a community-based acute care hospital. His medical history included prostate cancer with metastasis to his pelvis and type 2 diabetes mellitus, which was stable with treatment with oral medication. Six weeks earlier the patient reported a severe frontal headache that was not responding to over-the-counter analgesics. After 2 days with these symptoms, including a ground-level fall without injuries, he presented to the VAPSHCS emergency department (ED) where a complete neurological examination, including magnetic resonance imaging, revealed a left frontoparietal brain lesion that was 4.2 cm × 3.4 cm × 4.2 cm.

The patient experienced a seizure during his ED evaluation and was admitted for treatment. He underwent a craniotomy in which most, but not all, of the lesion was successfully removed. Postoperatively, the patient exhibited right-sided neglect, gait instability, emotional lability, and cognitive communication disorder. He completed 15 of 20 planned radiation treatments but declined further radiation or chemotherapy, deciding to halt treatment after the oncology service informed him that it would likely add only 1 to 2 months to his overall survival, which was < 6 months. The patient elected to focus his goals of care on comfort, dignity, and respect at the end of life and accepted the recommendation for hospice care. He was then transferred to the VAPSHCS CLC in Tacoma, Washington.

Upon admission, the patient weighed 94 kg, his vital signs were within reference ranges, and he reported no pain or headaches. His initial laboratory results revealed a hemoglobin of 13.2 g/dL, serum albumin of 3.6 g/dL, and hemoglobin A1c of 5.5%, all within normal reference ranges. The transferring medical team reported an ECOG score of 3 and a KPS score of 50%. The patient’s medications included scheduled dexamethasone, metformin, senna, and levetiracetam, with as-needed midazolam nasal spray for breakthrough seizures and as-needed acetaminophen for pain. He was alert, oriented ×3, and fully ambulatory but continuously used a 4-wheeled walker for safety given his gait instability.

After the patient’s first night, the hospice team met with him to discuss his understanding of his health issues. The patient appeared to have low health literacy but told the team, “I know I am dying.” He had completed written advance directives and a Portable Order for Life-Sustaining Treatment indicating that life-sustaining treatments, including cardiopulmonary resuscitation, supplemental mechanical feeding, or intubation, were not to be used to keep him alive.

At his first 90-day recertification, the patient had gained 8 kg and laboratory results revealed a 14.6 g/dL hemoglobin, 3.8 g/dL serum albumin, and a 6.1% hemoglobin A1c. His ECOG score remained at 3, but his KPS score had increased to 60%. The patient exhibited no new neurologic symptoms or seizures and reported no headaches but had 2 ground-level falls without injury. On both occasions the patient chose not to use his walker to go to the bathroom because it was “too far from my bed.” Per VA policy, after discussions with the hospice team, he was recertified for 90 more days of hospice care. At the end of 6 months in CLC, the patient’s weight remained stable, as did his complete blood count and comprehensive medical panel. He had 1 additional noninjurious ground-level fall and again reported no pain and no use of as-needed acetaminophen. His only medical complication was testing positive for COVID-19, but he remained asymptomatic. The patient was graduated from hospice care and referred to a nearby non-VA adult family home in the community after 180 days. At that time his ECOG score was 2 and his KPS score had increased to 70%.

Discussion

Primary brain tumors account for about 2% of all malignant neoplasms in adults, and about half of these are gliomas. Glioblastoma multiforme, derived from neuroepithelial cells, is the most frequent and deadly primary malignant central nervous system tumor in adults.8 About 50% of patients with glioblastomas are aged ≥ 65 years at diagnosis.9 A retrospective study of Centers for Medicare and Medicaid Services claims data paired with the Surveillance, Epidemiology, and End Results database indicated a median survival of 4 months for patients aged > 65 years with glioblastoma multiforme, across all treatment modalities.10 Surgical resection combined with radiation and chemotherapy offers the best prognosis for preservation of neurologic function.11 However, comorbidities, adverse drug effects, and the potential for postoperative complications pose significant risks, especially for older patients. Ultimately, goals of care conversations and advance directives play a very important role in weighing benefits against risks with this malignancy.

Our patient was aged 80 years and had previously been diagnosed with metastatic prostate malignancy. His goals of care focused on spending time with his friends, leaving his room to eat in the facility dining area, and continuing his daily walks. He remained clear that he did not want his care team to institute life-sustaining treatments to be kept alive and felt the information regarding the risks vs benefits of accepting chemotherapy was not aligned with his goals of care. Over the 6 months that he received hospice care, he gained weight, improved his hemoglobin and serum albumin levels, and ambulated with the use of a 4-wheeled walker. As the patient exhibited no functional decline or new comorbidities and his functional status improved, the clinical staff felt he no longer needed hospice services. The patient had an ECOG score of 2 and a KPS score of 70% at his hospice graduation.

Medical prognostication is one of the biggest challenges clinicians face. Clinicians generally overestimate prognosis, influenced by the patient relationship, their overall experiences in health care, and the desire to treat and cure.12 In hospice we are asked to define the usual, normal, or expected course of a disease, but what does that mean? Although metastatic malignancies usually follow a more predictable course than diagnoses such as dementia, chronic obstructive pulmonary disease, or congestive heart failure, improving prognostic accuracy and predicting disease course remain challenging.13-15 Focusing on functional status, goals of care, and comorbidities is key to prognosis. Given the challenge, we find the PPS, KPS, and ECOG scales to be important tools.

When prognosticating, we attempt to define quantity and quality of life (which our patients must define themselves or through the voice of a surrogate) and their ability to perform daily activities. Quality of life in patients with glioblastoma is progressively and significantly impaired by debilitating neurologic symptoms that arise as infiltrative tumor growth disrupts functionally intact brain tissue and restricts normal day-to-day activities. Functional status therefore plays a significant role in helping the hospice team refine its overall prognosis.

Conclusions

This case illustrates the difficulty of prognostication despite a patient's severely morbid disease, history of metastatic prostate cancer, and advanced age. Although a diagnosis may be concerning, documenting a patient’s status using functional scales prior to hospice admission and during the recertification process aids prognostication, giving health care professionals an accepted medical standard to apply regardless of how distinct the patient's diagnosis. The expression “the disease does not read the textbook” can serve as a helpful reminder in talking with patients and their families: most patients’ clinical disease courses differ, and using performance status scales may help improve prognostic skills.

References

1. Cleary TA. The Palliative Performance Scale (PPSv2) Version 2. In: Downing GM, ed. Medical Care of the Dying. 4th ed. Victoria Hospice Society, Learning Centre for Palliative Care; 2006:120.

2. Palliative Performance Scale. ePrognosis, University of California San Francisco. Accessed June 14, 2024. https://eprognosis.ucsf.edu/pps.php

3. Karnofsky DA, Burchenal JH. The Clinical Evaluation of Chemotherapeutic Agents in Cancer. In: MacLeod CM, ed. Evaluation of Chemotherapeutic Agents. Columbia University Press; 1949:191-205.

4. Khalid MA, Achakzai IK, Ahmed Khan S, et al. The use of Karnofsky Performance Status (KPS) as a predictor of 3 month post discharge mortality in cirrhotic patients. Gastroenterol Hepatol Bed Bench. 2018;11(4):301-305.

5. Karnofsky Performance Scale. US Dept of Veterans Affairs. Accessed June 14, 2024. https://www.hiv.va.gov/provider/tools/karnofsky-performance-scale.asp

6. Mischel A-M, Rosielle DA. Eastern Cooperative Oncology Group Performance Status. Palliative Care Network of Wisconsin. December 10, 2021. Accessed June 14, 2024. https://www.mypcnow.org/fast-fact/eastern-cooperative-oncology-group-performance-status/

7. Oken MM, Creech RH, Tormey DC, et al. Toxicity and response criteria of the Eastern Cooperative Oncology Group. Am J Clin Oncol. 1982;5(6):649-655.

8. Nizamutdinov D, Stock EM, Dandashi JA, et al. Prognostication of survival outcomes in patients diagnosed with glioblastoma. World Neurosurg. 2018;109:e67-e74. doi:10.1016/j.wneu.2017.09.104

9. Kita D, Ciernik IF, Vaccarella S, et al. Age as a predictive factor in glioblastomas: population-based study. Neuroepidemiology. 2009;33(1):17-22. doi:10.1159/000210017

10. Jordan JT, Gerstner ER, Batchelor TT, Cahill DP, Plotkin SR. Glioblastoma care in the elderly. Cancer. 2016;122(2):189-197. doi:10.1002/cncr.29742

11. Brown NF, Ottaviani D, Tazare J, et al. Survival outcomes and prognostic factors in glioblastoma. Cancers (Basel). 2022;14(13):3161. doi:10.3390/cancers14133161

12. Christakis NA. Death Foretold: Prophecy and Prognosis in Medical Care. University of Chicago Press; 2000.

13. Weissman DE. Determining Prognosis in Advanced Cancer. Palliative Care Network of Wisconsin. January 28, 2019. Accessed June 14, 2024. https://www.mypcnow.org/fast-fact/determining-prognosis-in-advanced-cancer/

14. Childers JW, Arnold R, Curtis JR. Prognosis in End-Stage COPD. Palliative Care Network of Wisconsin. February 11, 2019. Accessed June 14, 2024. https://www.mypcnow.org/fast-fact/prognosis-in-end-stage-copd/

15. Reisfield GM, Wilson GR. Prognostication in Heart Failure. Palliative Care Network of Wisconsin. February 11, 2019. Accessed June 14, 2024. https://www.mypcnow.org/fast-fact/prognostication-in-heart-failure/


Risk Stratification May Work Well for FIT-Based CRC Screening in Elderly

Article Type
Changed
Wed, 08/07/2024 - 14:59

A risk-stratified upper age limit may be beneficial for colorectal cancer (CRC) screening among patients who are ages 75 and older, according to a study presented at the annual Digestive Disease Week® (DDW).

In particular, interval CRC risk can vary substantially based on the fecal hemoglobin (f-Hb) concentration in the patient’s last fecal immunochemical test (FIT), as well as the number of prior screening rounds.

“Less is known about what happens after the upper age limit has been reached and individuals are not invited to participate in more screening rounds. This is important as life expectancy is increasing, and it is increasingly important to consider the most efficient way of screening the elderly,” said lead author Brenda van Stigt, a PhD candidate focused on cancer screening at Erasmus University Medical Center in Rotterdam, the Netherlands.

In the Netherlands, adults between ages 55 and 75 are invited to participate in stool-based CRC screening every 2 years. Based on a FIT threshold of 47 μg Hb/g, those who test positive are referred for colonoscopy, and those who test negative are invited again after 2 years.

FIT can play a major role in risk stratification, Ms. van Stigt noted, along with other factors that influence CRC risk, such as age, sex, and CRC screening history. Although this is documented for ages 55-75, she and colleagues wanted to know more about what happens after age 75.

Ms. Van Stigt and colleagues conducted a population-based study by analyzing Dutch national cancer registry data and FIT results around the final screening at age 75, looking at those who were diagnosed with CRC within 24 months of their last negative FIT. The researchers assessed interval CRC risk and cancer stage, accounting for sex, last f-Hb concentration, and the number of screening rounds.

Among 305,761 people with a complete 24-month follow-up after a negative FIT, 661 patients were diagnosed with interval CRC, indicating an overall interval CRC risk of 21.6 per 10,000 individuals with a negative FIT. There were no significant differences by sex.
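As a quick check, the overall rate the study reports can be reproduced from the counts above (an illustrative sketch; variable names are ours, not the study's):

```python
# Counts reported in the study
negative_fits = 305_761  # people with complete 24-month follow-up after a negative FIT
interval_crc = 661       # interval CRC diagnoses within that window

# Risk expressed per 10,000 individuals with a negative FIT
risk_per_10k = interval_crc / negative_fits * 10_000
print(round(risk_per_10k, 1))  # 21.6
```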

However, risk differed by the number of screening rounds: those who had participated in three or four rounds had a lower risk than those who had participated only once (HR, 0.49).

In addition, those with detectable f-Hb (>0 μg Hb/g) in their last screening round had a much higher interval CRC risk (HR, 4.87), at 65.8 per 10,000 negative FITs, compared with 13.8 per 10,000 among those without detectable f-Hb. Interval CRC risk also increased over time for those with detectable f-Hb.
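The crude rate ratio implied by these per-10,000 figures can be computed directly; it lands close to, but not exactly at, the reported HR of 4.87, which is expected because the hazard ratio is adjusted for covariates (a back-of-the-envelope sketch using only the published rates):

```python
rate_detectable = 65.8    # interval CRCs per 10,000 negative FITs, detectable f-Hb
rate_undetectable = 13.8  # per 10,000 negative FITs, no detectable f-Hb

# Unadjusted ratio of the two published rates
crude_ratio = rate_detectable / rate_undetectable
print(round(crude_ratio, 2))  # 4.77
```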

About 15% of the total population had detectable f-Hb, whereas 46% of those with interval CRC had detectable f-Hb, Ms. van Stigt said, meaning that nearly half of patients who were diagnosed with interval CRC already had detectable f-Hb in their prior FIT.

In a survival analysis, there was no association between interval CRC risk and sex. However, those who participated in three or four screening rounds were half as likely to be diagnosed as those who participated once or twice, and those with detectable f-Hb were five times as likely to be diagnosed.

For late-stage CRC, there was no association with sex or the number of screening rounds. Detectable f-Hb was associated with not only a higher risk of interval CRC but also a late-stage diagnosis.

“These findings indicate that one uniform age to stop screening is suboptimal,” Ms. van Stigt said. “Personalized screening strategies should, therefore, also ideally incorporate a risk-stratified age to stop screening.”

The US Preventive Services Task Force recommends that clinicians personalize screening for ages 76-85, accounting for overall health, prior screening history, and patient preferences.

“But we have no clear guidance on how to quantify or weigh these factors. This interesting study highlights how one of these factors (prior screening history) and fecal hemoglobin level (an emerging factor) are powerful stratifiers of subsequent colorectal cancer risk,” said Sameer D. Saini, MD, AGAF, director and research investigator at the VA Ann Arbor Healthcare System’s Center for Clinical Management Research. Dr. Saini wasn’t involved with the study.


At the clinical level, Dr. Saini said, sophisticated modeling is needed to understand the interaction with competing risks and identify the optimal screening strategies for patients at varying levels of cancer risk and life expectancy. Models could also help to quantify the population benefits and cost-effectiveness of personalized screening.

“Finally, it is important to note that, in many health systems, access to quantitative FIT may be limited,” he said. “These data may be less informative if colonoscopy is the primary mode of screening.”

Ms. van Stigt and Dr. Saini reported no relevant disclosures.


FROM DDW 2024

Statins, Vitamin D, and Exercise in Older Adults

Article Type
Changed
Mon, 07/29/2024 - 15:09

In this article, I will review several recently published articles and guidelines relevant to the care of older adults in primary care. The articles of interest address statins for primary prevention, vitamin D supplementation and testing, and physical activity for healthy aging.
 

Statins for Primary Prevention of Cardiovascular Disease

A common conundrum in primary care is whether an older adult should be on a statin for primary prevention. This question has been difficult to answer because of the underrepresentation of older adults in clinical trials that examine the effect of statins for primary prevention. A recent study by Xu et al. published in Annals of Internal Medicine sought to address this gap in knowledge, investigating the risks and benefits of statins for primary prevention for older adults.1

The study stratified participants into "old" (aged 75-84 years) and "very old" (85 years or older) groups. Older adults with an indication for statins were initiated on therapy, followed over a 5-year period, and compared with age-matched cohorts not initiated on statin therapy. Participants with known cardiovascular disease at baseline were excluded. The outcomes of interest were major cardiovascular disease (CVD; a composite of myocardial infarction, stroke, or heart failure), all-cause mortality, and adverse effects of drug therapy (myopathy or liver dysfunction).

The study found that among older adults aged 75-84, initiation of statin therapy led to a 1.2% risk reduction in major CVD over a 5-year period. For older adults aged 85 and greater, initiation of statins had an even larger impact, leading to a 4.4% risk reduction in major CVD over a 5-year period. The study found that there was no significant difference in adverse effects including myopathy or liver dysfunction in both age groups.
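The article does not report numbers needed to treat, but a rough NNT can be derived from the absolute risk reductions above (a common back-of-the-envelope calculation, assuming the reported percentages are absolute risk reductions over 5 years):

```python
import math

# Absolute risk reductions in major CVD over 5 years, as reported
arr_75_to_84 = 0.012  # 1.2% for ages 75-84
arr_85_plus = 0.044   # 4.4% for ages 85 and older

# Number needed to treat = 1 / ARR, conventionally rounded up
print(math.ceil(1 / arr_75_to_84))  # 84
print(math.ceil(1 / arr_85_plus))   # 23
```

In other words, the larger absolute benefit in the oldest group translates into a substantially smaller number needed to treat.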

Statins, the study suggests, are appropriate and safe to initiate for primary prevention in older adults and can lead to substantial reductions in CVD. While time to benefit was not explicitly examined in this study, a prior study by Yourman et al. suggested that the time to benefit for statins for primary prevention in adults aged 50-75 would be at least 2.5 years.2

My takeaway from these findings is to discuss statin initiation for primary prevention for older patients who are focused on longevity, have good functional status (often used in geriatrics as a proxy for prognosis), and are willing to accept more medications.
 

Empiric Vitamin D Supplementation in Adults over 75 Years

Vitamin D is one of the most common supplements taken by older adults, but evidence supporting vitamin D supplementation is variable in the published literature, as most data come from observational trials. New guidelines from the Endocrine Society focused on developing recommendations for healthy individuals with data obtained from randomized controlled trials (RCTs) and large longitudinal observational trials with comparison groups if RCTs were not available. These guidelines recommend against empiric supplementation of vitamin D for healthy adults aged 18-74, excluding pregnant women and patients with high-risk diabetes.3

For older adults aged 75 or greater, empiric vitamin D supplementation is recommended because of the possible reduction of risk in all-cause mortality in this population. Of note, this was a grade 2 recommendation by the panel, indicating that the benefits of the treatment probably outweigh the risks. The panel stated that vitamin D supplementation could be delivered through fortified foods, multivitamins with vitamin D, or as a separate vitamin D supplement.

The dosage should remain within the recommended daily allowance outlined by the Institute of Medicine of 800 IU daily for adults over 70, and the panel recommends low-dose daily vitamin D supplementation over high-dose interval supplementation. The panel noted that routine screening of vitamin D levels should not be used to guide decision-making on whether to start supplementation, but vitamin D levels should be obtained for patients who have an indication for evaluation.

The reviewers highlight that these guidelines were developed for healthy individuals and are not applicable to those with conditions that warrant vitamin D evaluation. In my clinical practice, many of my patients have bone-mineral conditions and cognitive impairment that warrant evaluation. Based on these guidelines, I will consider empiric vitamin D supplementation more often for healthy patients aged 75 and older.

Sedentary Behaviors and Healthy Aging

Engaging inactive older adults in regular physical activity can be challenging, particularly as the pandemic has led to more pervasive social isolation and affected the availability of in-person exercise activities in the community. Physical activity is a key component of healthy aging and cognition, and its benefits should be a part of routine counseling for older adults.

An interesting recent study published in JAMA Network Open by Shi et al. evaluated the association of health behaviors and aging in female US nurses over a 20-year period.4 Surveys were administered to capture time spent in each behavior, such as being sedentary (TV watching, sitting at home or at work), light activity (walking around the house or at work), and moderate to vigorous activity (walking for exercise, lawn mowing). “Healthy aging” was defined by the absence of chronic conditions such as heart failure, and lack of physical, mental, and cognitive impairment.

The study found that participants who were more sedentary were less likely to age healthfully, with each additional 2 hours of TV watching per day associated with a 12% reduction in likelihood of healthy aging. Light physical activity was associated with a significant increase in healthy aging, with a 6% increase in the likelihood of healthy aging for each additional 2 hours of light activity. Each additional 1 hour of moderate to vigorous activity was associated with a 14% increase in the likelihood of healthy aging. These findings support discussions with patients that behavior change, even in small increments, can be beneficial in healthy aging.
 

References

1. Xu W et al. Ann Intern Med. 2024 Jun;177(6):701-10.

2. Yourman LC et al. JAMA Intern Med. 2021;181:179-85.

3. Demay MB et al. J Clin Endocrinol Metab. 2024 Aug;109(8):1907-47.

4. Shi H et al. JAMA Netw Open. 2024;7(6):e2416300.

Publications
Topics
Sections

In this article, I will review several recently published articles and guidelines relevant to the care of older adults in primary care. The articles of interest address statins for primary prevention, vitamin D supplementation and testing, and physical activity for healthy aging.
 

Statins for Primary Prevention of Cardiovascular Disease

A common conundrum in primary care is whether an older adult should be on a statin for primary prevention. This question has been difficult to answer because of the underrepresentation of older adults in clinical trials that examine the effect of statins for primary prevention. A recent study by Xu et al. published in Annals of Internal Medicine sought to address this gap in knowledge, investigating the risks and benefits of statins for primary prevention for older adults.1

This study stratified participants by “old” (aged 75-84 years) and “very old” (85 years or older). In this study, older adults who had an indication for statins were initiated on statins and studied over a 5-year period and compared with age-matched cohorts not initiated on statin therapy. Participants with known cardiovascular disease at baseline were excluded. The outcomes of interest were major cardiovascular disease (CVD) (a composite of myocardial infarction, stroke, or heart failure), all-cause mortality, and adverse effect of drug therapy (myopathy or liver dysfunction).

The study found that among older adults aged 75-84, initiation of statin therapy led to a 1.2% risk reduction in major CVD over a 5-year period. For older adults aged 85 and greater, initiation of statins had an even larger impact, leading to a 4.4% risk reduction in major CVD over a 5-year period. The study found that there was no significant difference in adverse effects including myopathy or liver dysfunction in both age groups.

Statins, the study suggests, are appropriate and safe to initiate for primary prevention in older adults and can lead to substantial benefits in reduction of CVD. While time to benefit was not explicitly examined in this study, a prior study by Yourman et al. suggested that the time to benefit for statins for primary prevention in adults aged 50-75 would be least 2.5 years.2

My takeaway from these findings is to discuss statin initiation for primary prevention for older patients who are focused on longevity, have good functional status (often used in geriatrics as a proxy for prognosis), and are willing to accept more medications.
 

Empiric Vitamin D Supplementation in Adults over 75 Years

Vitamin D is one of the most common supplements taken by older adults but evidence supporting vitamin D supplementation is variable in published literature, as most data comes from observational trials. New guidelines from the Endocrine Society focused on developing recommendations for healthy individuals with data obtained from randomized controlled trials (RCTs) and large longitudinal observational trials with comparison groups if RCTs were not available. These guidelines recommend against empiric supplementation of vitamin D for healthy adults aged 18-74, excluding pregnant women and patients with high-risk diabetes.3

For older adults aged 75 or greater, empiric vitamin D supplementation is recommended because of the possible reduction of risk in all-cause mortality in this population. Of note, this was a grade 2 recommendation by the panel, indicating that the benefits of the treatment probably outweigh the risks. The panel stated that vitamin D supplementation could be delivered through fortified foods, multivitamins with vitamin D, or as a separate vitamin D supplement.

The dosage should remain within the recommended daily allowance outlined by the Institute of Medicine of 800 IU daily for adults over 70, and the panel recommends low-dose daily vitamin D supplementation over high-dose interval supplementation. The panel noted that routine screening of vitamin D levels should not be used to guide decision-making on whether to start supplementation, but vitamin D levels should be obtained for patients who have an indication for evaluation.

The reviewers highlight that these guidelines were developed for healthy individuals and are not applicable to those with conditions that warrant vitamin D evaluation. In my clinical practice, many of my patients have bone-mineral conditions and cognitive impairment that warrant evaluation. Based on these guidelines, I will consider empiric vitamin D supplementation more often for healthy patients aged 75 and older.
 

 

 

Sedentary Behaviors and Healthy Aging

Engaging inactive older adults in regular physical activity can be challenging, particularly as the pandemic has led to more pervasive social isolation and affected the availability of in-person exercise activities in the community. Physical activity is a key component of healthy aging and cognition, and its benefits should be a part of routine counseling for older adults.

An interesting recent study published in JAMA Network Open by Shi et al. evaluated the association of health behaviors and aging in female US nurses over a 20-year period.4 Surveys were administered to capture time spent in each behavior, such as being sedentary (TV watching, sitting at home or at work), light activity (walking around the house or at work), and moderate to vigorous activity (walking for exercise, lawn mowing). “Healthy aging” was defined by the absence of chronic conditions such as heart failure, and lack of physical, mental, and cognitive impairment.

The study found that participants who were more sedentary were less likely to age healthfully, with each additional 2 hours of TV watching per day associated with a 12% reduction in likelihood of healthy aging. Light physical activity was associated with a significant increase in healthy aging, with a 6% increase in the likelihood of healthy aging for each additional 2 hours of light activity. Each additional 1 hour of moderate to vigorous activity was associated with a 14% increase in the likelihood of healthy aging. These findings support discussions with patients that behavior change, even in small increments, can be beneficial in healthy aging.
 

References

1. Xu W et al. Ann Intern Med. 2024 Jun;177(6):701-10.

2. Yourman LC et al. JAMA Intern Med. 2021;181:179-85.

3. Demay MB et al. J Clin Endocrinol Metab. August 2024;109(8):1907-47.

4. Shi H et al. JAMA Netw Open. 2024;7(6):e2416300.

In this article, I will review several recently published articles and guidelines relevant to the care of older adults in primary care. The articles of interest address statins for primary prevention, vitamin D supplementation and testing, and physical activity for healthy aging.
 

Statins for Primary Prevention of Cardiovascular Disease

A common conundrum in primary care is whether an older adult should be on a statin for primary prevention. This question has been difficult to answer because of the underrepresentation of older adults in clinical trials that examine the effect of statins for primary prevention. A recent study by Xu et al. published in Annals of Internal Medicine sought to address this gap in knowledge, investigating the risks and benefits of statins for primary prevention for older adults.1

This study stratified participants by “old” (aged 75-84 years) and “very old” (85 years or older). In this study, older adults who had an indication for statins were initiated on statins and studied over a 5-year period and compared with age-matched cohorts not initiated on statin therapy. Participants with known cardiovascular disease at baseline were excluded. The outcomes of interest were major cardiovascular disease (CVD) (a composite of myocardial infarction, stroke, or heart failure), all-cause mortality, and adverse effect of drug therapy (myopathy or liver dysfunction).

The study found that among older adults aged 75-84, initiation of statin therapy led to a 1.2% risk reduction in major CVD over a 5-year period. For older adults aged 85 and greater, initiation of statins had an even larger impact, leading to a 4.4% risk reduction in major CVD over a 5-year period. The study found that there was no significant difference in adverse effects including myopathy or liver dysfunction in both age groups.

Statins, the study suggests, are appropriate and safe to initiate for primary prevention in older adults and can lead to substantial benefits in reduction of CVD. While time to benefit was not explicitly examined in this study, a prior study by Yourman et al. suggested that the time to benefit for statins for primary prevention in adults aged 50-75 would be least 2.5 years.2

My takeaway from these findings is to discuss statin initiation for primary prevention for older patients who are focused on longevity, have good functional status (often used in geriatrics as a proxy for prognosis), and are willing to accept more medications.
 

Empiric Vitamin D Supplementation in Adults over 75 Years

Vitamin D is one of the most common supplements taken by older adults but evidence supporting vitamin D supplementation is variable in published literature, as most data comes from observational trials. New guidelines from the Endocrine Society focused on developing recommendations for healthy individuals with data obtained from randomized controlled trials (RCTs) and large longitudinal observational trials with comparison groups if RCTs were not available. These guidelines recommend against empiric supplementation of vitamin D for healthy adults aged 18-74, excluding pregnant women and patients with high-risk diabetes.3

For older adults aged 75 or greater, empiric vitamin D supplementation is recommended because of the possible reduction of risk in all-cause mortality in this population. Of note, this was a grade 2 recommendation by the panel, indicating that the benefits of the treatment probably outweigh the risks. The panel stated that vitamin D supplementation could be delivered through fortified foods, multivitamins with vitamin D, or as a separate vitamin D supplement.

The dosage should remain within the recommended daily allowance outlined by the Institute of Medicine of 800 IU daily for adults over 70, and the panel recommends low-dose daily vitamin D supplementation over high-dose interval supplementation. The panel noted that routine screening of vitamin D levels should not be used to guide decision-making on whether to start supplementation, but vitamin D levels should be obtained for patients who have an indication for evaluation.

The reviewers highlight that these guidelines were developed for healthy individuals and are not applicable to those with conditions that warrant vitamin D evaluation. In my clinical practice, many of my patients have bone-mineral conditions and cognitive impairment that warrant evaluation. Based on these guidelines, I will consider empiric vitamin D supplementation more often for healthy patients aged 75 and older.
 

 

 

Sedentary Behaviors and Healthy Aging

Engaging inactive older adults in regular physical activity can be challenging, particularly as the pandemic has led to more pervasive social isolation and affected the availability of in-person exercise activities in the community. Physical activity is a key component of healthy aging and cognition, and its benefits should be a part of routine counseling for older adults.

An interesting recent study published in JAMA Network Open by Shi et al. evaluated the association of health behaviors and aging in female US nurses over a 20-year period.4 Surveys were administered to capture time spent in each behavior, such as being sedentary (TV watching, sitting at home or at work), light activity (walking around the house or at work), and moderate to vigorous activity (walking for exercise, lawn mowing). “Healthy aging” was defined by the absence of chronic conditions such as heart failure, and lack of physical, mental, and cognitive impairment.

The study found that participants who were more sedentary were less likely to age healthfully, with each additional 2 hours of TV watching per day associated with a 12% reduction in likelihood of healthy aging. Light physical activity was associated with a significant increase in healthy aging, with a 6% increase in the likelihood of healthy aging for each additional 2 hours of light activity. Each additional 1 hour of moderate to vigorous activity was associated with a 14% increase in the likelihood of healthy aging. These findings support discussions with patients that behavior change, even in small increments, can be beneficial in healthy aging.
References

1. Xu W et al. Ann Intern Med. 2024 Jun;177(6):701-10.

2. Yourman LC et al. JAMA Intern Med. 2021;181:179-85.

3. Demay MB et al. J Clin Endocrinol Metab. August 2024;109(8):1907-47.

4. Shi H et al. JAMA Netw Open. 2024;7(6):e2416300.


Dermatoporosis in Older Adults: A Condition That Requires Holistic, Creative Management

Article Type
Changed
Tue, 07/23/2024 - 12:19

The chronic, excessive fragility of aging and sun-damaged skin has a name in the medical literature: dermatoporosis. This identification is helpful because it validates patients’ suffering and conveys the skin’s vulnerability to serious medical complications, said Adam Friedman, MD, at the ElderDerm conference on dermatology in the older patient.

Key features of dermatoporosis include atrophic skin, solar purpura, white pseudoscars, easily acquired skin lacerations and tears, bruises, and delayed healing. “We’re going to see more of this, and it will more and more be a chief complaint of patients,” said Dr. Friedman, professor and chair of dermatology at George Washington University (GWU) in Washington, and co-chair of the meeting. GWU hosted the conference, describing it as a first-of-its-kind meeting dedicated to improving dermatologic care for older adults.

[Photo: Dr. Adam Friedman, George Washington University]

Dermatoporosis was described in the literature in 2007 by dermatologists at the University of Geneva in Switzerland. “It is not only a cosmetic problem,” Dr. Friedman said. “This is a medical problem ... which can absolutely lead to comorbidities [such as deep dissecting hematomas] that are a huge strain on the healthcare system.”

Dermatologists can meet the moment with holistic, creative combination treatment and counseling approaches aimed at improving the mechanical strength of skin and preventing potential complications in older patients, Dr. Friedman said at the meeting.

He described the case of a 76-year-old woman who presented with dermatoporosis on her arms involving pronounced skin atrophy, solar purpura, and a small covered laceration. “This was a patient who was both devastated by the appearance” and impacted by the pain and burden of dressing frequent wounds, said Dr. Friedman, who is also the director of the Residency Program, of Translational Research, and of Supportive Oncodermatology, all within the Department of Dermatology at GWU.

After 11 months of topical treatment (daily application of calcipotriene 0.05% ointment and nightly application of tazarotene 0.045% lotion), plus oral supplementation with vitamin C 1000 mg twice daily and a citrus bioflavonoid complex 1000 mg daily, and with no changes to the medications she took for various comorbidities, the solar purpura improved significantly and "we made a huge difference in the integrity of her skin," he said.

Dr. Friedman also described this case in a recently published article in the Journal of Drugs in Dermatology titled “What’s Old Is New: An Emerging Focus on Dermatoporosis”. 
 

Likely Pathophysiology

Advancing age and chronic ultraviolet (UV) radiation exposure are the chief drivers of dermatoporosis. Beyond UVA and UVB light, secondary drivers include genetic susceptibility, topical and systemic corticosteroid use, and anticoagulant treatment.

Its pathogenesis is not well described in the literature but is easy to envision, Dr. Friedman said. For one, both advancing age and exposure to UV light lead to a reduction in hygroscopic glycosaminoglycans, including hyaluronate (HA), and the impact of this diminishment is believed to go “beyond [the loss of] buoyancy,” he noted. Researchers have “been showing these are not just water-loving molecules, they also have some biologic properties” relating to keratinocyte production and epidermal turnover that appear to be intricately linked to the pathogenesis of dermatoporosis. 

HAs have been shown to interact with the cell surface receptor CD44 to stimulate keratinocyte proliferation, and low levels of CD44 have been reported in skin with dermatoporosis compared with a younger control population. (A newly characterized organelle, the hyaluronosome, serves as an HA factory and contains CD44 and heparin-binding epidermal growth factor, Dr. Friedman noted. Inadequate functioning may be involved in skin atrophy.) 

Advancing age also brings an increase in matrix metalloproteinases (MMPs)–1, –2, and –3, which are “the demolition workers of the skin,” and downregulation of a tissue inhibitor of MMPs, he said. 

Adding insult to injury, dermis-penetrating UVA also activates MMPs, "obliterating collagen and elastin." UVB generates DNA photoproducts and oxidative stress, directly damaging skin cell DNA. "That UV light induces breakdown [of the skin] through different mechanisms and inhibits buildup is a simple concept I think our patients can understand," Dr. Friedman said.
Multifaceted Treatment

For an older adult, “there is never a wrong time to start sun-protective measures” to prevent or try to halt the progression of dermatoporosis, Dr. Friedman said, noting that “UV radiation is an immunosuppressant, so there are many good reasons to start” if the adult is not already taking measures on a regular basis.

Potential treatments for the syndrome of dermatoporosis are backed by few clinical studies, but dermatologists are skilled at translating the use of products from one disease state to another based on understandings of pathophysiology and mechanistic pathways, Dr. Friedman commented in an interview after the meeting. 

For instance, “from decades of research, we know what retinoids will do to the skin,” he said in the interview. “We know they will turn on collagen-1 and -3 genes in the skin, and that they will increase the production of glycosaminoglycans ... By understanding the biology, we can translate this to dermatoporosis.” These changes were demonstrated, for instance, in a small study of topical retinol in older adults.

Studies of topical alpha hydroxy acid (AHA), moreover, have demonstrated epidermal thickening and firmness, and “some studies show they can limit steroid-induced atrophy,” Dr. Friedman said at the meeting. “And things like lactic acid and urea are super accessible.”

Topical dehydroepiandrosterone is backed by even less data than retinoids or AHAs are, “but it’s still something to consider” as part of a multimechanistic approach to dermatoporosis, Dr. Friedman shared, noting that a small study demonstrated beneficial effects on epidermal atrophy in aging skin. 

The use of vitamin D analogues such as calcipotriene, which is approved for the treatment of psoriasis, may also be promising. “One concept is that [vitamin D analogues] increase calcium concentrations in the epidermis, and calcium is so central to keratinocyte differentiation” and epidermal function that calcipotriene in combination with topical steroid therapy has been shown to limit skin atrophy, he noted.

Nutritionally, low protein intake is a known problem in the older population and is associated with increased skin fragility and poorer healing. From a prevention and treatment standpoint, therefore, patients can be counseled to be attentive to their diets, Dr. Friedman said. Experts have recommended a higher protein intake for older adults than for younger adults; in 2013, an international group recommended a protein intake of 1-1.5 g/kg/d for healthy older adults and more for those with acute or chronic illness.

“Patients love talking about diet and skin disease ... and they love over-the-counter nutraceuticals as well because they want something natural,” Dr. Friedman said. “I like using bioflavonoids in combination with vitamin C, which can be effective especially for solar purpura.”

[Photo: Actinic senile purpura, a common feature of dermatoporosis. Courtesy of Dr. Adam Friedman]


A 6-week randomized, placebo-controlled, double-blind trial involving 67 patients with purpura associated with aging found a 50% reduction in purpura lesions among those who took a particular citrus bioflavonoid blend twice daily. "I thought this was a pretty well-done study," he said, noting that both patient self-assessment and investigator global assessment were used.
Skin Injury and Wound Prevention

In addition to recommending gentle skin cleansers and daily moisturizing, dermatologists should talk to their older patients with dermatoporosis about their home environments. “What is it like? Is there furniture with sharp edges?” Dr. Friedman advised. If so, could they use sleeves or protectors on their arms or legs “to protect against injury?”

In a later meeting session about lower-extremity wounds in geriatric patients, Michael Stempel, DPM, assistant professor of medicine and surgery and chief of podiatry at GWU, said that he was happy to hear the term dermatoporosis being used because, like diabetes, it is a risk factor for developing lower-extremity wounds and poor wound healing.

He shared the case of an older woman with dermatoporosis who “tripped and skinned her knee against a step and then self-treated it for over a month by pouring hydrogen peroxide over it and letting air get to it.” The wound developed into “full-thickness tissue loss,” said Dr. Stempel, also medical director of the Wound Healing and Limb Preservation Center at GWU Hospital. 

Misperceptions are common among older patients about how a simple wound should be managed; for instance, the adage “just let it get air” is not uncommon. This makes anticipatory guidance about basic wound care — such as the importance of a moist and occlusive environment and the safe use of hydrogen peroxide — especially important for patients with dermatoporosis, Dr. Friedman commented after the meeting.

Dermatoporosis is quantifiable, Dr. Friedman said during the meeting: the researchers in Switzerland who originally coined the term also developed a scoring system. Using it in practice is unnecessary, but its existence is "nice to share with patients who feel bothered because oftentimes, patients feel it's been dismissed by other providers," he said. "Telling your patients there's an actual name for their problem, and that there are ways to quantify and measure changes over time, is validating."

Its recognition as a medical condition, Dr. Friedman added, also enables the dermatologist to bring it up and counsel appropriately — without a patient feeling shame — when it is identified in the context of a skin excision, treatment of a primary inflammatory skin disease, or management of another dermatologic problem.

Dr. Friedman disclosed that he is a consultant/advisory board member for L’Oréal, La Roche-Posay, Galderma, and other companies; a speaker for Regeneron/Sanofi, Incyte, BMD, and Janssen; and has grants from Pfizer, Lilly, Incyte, and other companies. Dr. Stempel reported no disclosures.

A version of this article first appeared on Medscape.com.

Publications
Topics
Sections

The chronic, excessive fragility of aging and sun-damaged skin has a name in the medical literature: dermatoporosis. This identification is helpful because it validates patients’ suffering and conveys the skin’s vulnerability to serious medical complications, said Adam Friedman, MD, at the ElderDerm conference on dermatology in the older patient.

Key features of dermatoporosis include atrophic skin, solar purpura, white pseudoscars, easily acquired skin lacerations and tears, bruises, and delayed healing. “We’re going to see more of this, and it will more and more be a chief complaint of patients,” said Dr. Friedman, professor and chair of dermatology at George Washington University (GWU) in Washington, and co-chair of the meeting. GWU hosted the conference, describing it as a first-of-its-kind meeting dedicated to improving dermatologic care for older adults.

Dr. Adam Friedman, professor and interim chief of dermatology, George Washington University, Washington
Dr. Adam Friedman


Dermatoporosis was described in the literature in 2007 by dermatologists at the University of Geneva in Switzerland. “It is not only a cosmetic problem,” Dr. Friedman said. “This is a medical problem ... which can absolutely lead to comorbidities [such as deep dissecting hematomas] that are a huge strain on the healthcare system.”

Dermatologists can meet the moment with holistic, creative combination treatment and counseling approaches aimed at improving the mechanical strength of skin and preventing potential complications in older patients, Dr. Friedman said at the meeting.

He described the case of a 76-year-old woman who presented with dermatoporosis on her arms involving pronounced skin atrophy, solar purpura, and a small covered laceration. “This was a patient who was both devastated by the appearance” and impacted by the pain and burden of dressing frequent wounds, said Dr. Friedman, who is also the director of the Residency Program, of Translational Research, and of Supportive Oncodermatology, all within the Department of Dermatology at GWU.

With 11 months of topical treatment that included daily application of calcipotriene 0.05% ointment and nightly application of tazarotene 0.045% lotion and oral supplementation with 1000-mg vitamin C twice daily and 1000-mg citrus bioflavonoid complex daily, as well as no changes to the medications she took for various comorbidities, the solar purpura improved significantly and “we made a huge difference in the integrity of her skin,” he said. 

Dr. Friedman also described this case in a recently published article in the Journal of Drugs in Dermatology titled “What’s Old Is New: An Emerging Focus on Dermatoporosis”. 
 

Likely Pathophysiology

Advancing age and chronic ultraviolet (UV) radiation exposure are the chief drivers of dermatoporosis. In addition to UVA and UVB light, other secondary drivers include genetic susceptibility, topical and systematic corticosteroid use, and anticoagulant treatment.

Its pathogenesis is not well described in the literature but is easy to envision, Dr. Friedman said. For one, both advancing age and exposure to UV light lead to a reduction in hygroscopic glycosaminoglycans, including hyaluronate (HA), and the impact of this diminishment is believed to go “beyond [the loss of] buoyancy,” he noted. Researchers have “been showing these are not just water-loving molecules, they also have some biologic properties” relating to keratinocyte production and epidermal turnover that appear to be intricately linked to the pathogenesis of dermatoporosis. 

HAs have been shown to interact with the cell surface receptor CD44 to stimulate keratinocyte proliferation, and low levels of CD44 have been reported in skin with dermatoporosis compared with a younger control population. (A newly characterized organelle, the hyaluronosome, serves as an HA factory and contains CD44 and heparin-binding epidermal growth factor, Dr. Friedman noted. Inadequate functioning may be involved in skin atrophy.) 

Advancing age also brings an increase in matrix metalloproteinases (MMPs)–1, –2, and –3, which are “the demolition workers of the skin,” and downregulation of a tissue inhibitor of MMPs, he said. 

Adding insult to injury, dermis-penetrating UVA also activates MMPs, “obliterating collagen and elastin.” UVB generates DNA photoproducts, including oxidative stress and damaging skin cell DNA. “That UV light induces breakdown [of the skin] through different mechanisms and inhibits buildup is a simple concept I think our patients can understand,” Dr. Friedman said.
 

 

 

Multifaceted Treatment

For an older adult, “there is never a wrong time to start sun-protective measures” to prevent or try to halt the progression of dermatoporosis, Dr. Friedman said, noting that “UV radiation is an immunosuppressant, so there are many good reasons to start” if the adult is not already taking measures on a regular basis.

Potential treatments for the syndrome of dermatoporosis are backed by few clinical studies, but dermatologists are skilled at translating the use of products from one disease state to another based on understandings of pathophysiology and mechanistic pathways, Dr. Friedman commented in an interview after the meeting. 

For instance, “from decades of research, we know what retinoids will do to the skin,” he said in the interview. “We know they will turn on collagen-1 and -3 genes in the skin, and that they will increase the production of glycosaminoglycans ... By understanding the biology, we can translate this to dermatoporosis.” These changes were demonstrated, for instance, in a small study of topical retinol in older adults.

Studies of topical alpha hydroxy acid (AHA), moreover, have demonstrated epidermal thickening and firmness, and “some studies show they can limit steroid-induced atrophy,” Dr. Friedman said at the meeting. “And things like lactic acid and urea are super accessible.”

Topical dehydroepiandrosterone is backed by even less data than retinoids or AHAs are, “but it’s still something to consider” as part of a multimechanistic approach to dermatoporosis, Dr. Friedman shared, noting that a small study demonstrated beneficial effects on epidermal atrophy in aging skin. 

The use of vitamin D analogues such as calcipotriene, which is approved for the treatment of psoriasis, may also be promising. “One concept is that [vitamin D analogues] increase calcium concentrations in the epidermis, and calcium is so central to keratinocyte differentiation” and epidermal function that calcipotriene in combination with topical steroid therapy has been shown to limit skin atrophy, he noted.

Nutritionally, low protein intake is a known problem in the older population and is associated with increased skin fragility and poorer healing. From a prevention and treatment standpoint, therefore, patients can be counseled to be attentive to their diets, Dr. Friedman said. Experts have recommended a higher protein intake for older adults than for younger adults; in 2013, an international group recommended a protein intake of 1-1.5 g/kg/d for healthy older adults and more for those with acute or chronic illness.

“Patients love talking about diet and skin disease ... and they love over-the-counter nutraceuticals as well because they want something natural,” Dr. Friedman said. “I like using bioflavonoids in combination with vitamin C, which can be effective especially for solar purpura.”

Actinic senile purpura, a common feature of dermatoporosis
Courtesy Dr. Adam Friedman
Actinic senile purpura, a common feature of dermatoporosis


A 6-week randomized, placebo-controlled, double-blind trial involving 67 patients with purpura associated with aging found a 50% reduction in purpura lesions among those took a particular citrus bioflavonoid blend twice daily. “I thought this was a pretty well-done study,” he said, noting that patient self-assessment and investigator global assessment were utilized.
 

 

 

Skin Injury and Wound Prevention

In addition to recommending gentle skin cleansers and daily moisturizing, dermatologists should talk to their older patients with dermatoporosis about their home environments. “What is it like? Is there furniture with sharp edges?” Dr. Friedman advised. If so, could they use sleeves or protectors on their arms or legs “to protect against injury?”

In a later meeting session about lower-extremity wounds on geriatric patients, Michael Stempel, DPM, assistant professor of medicine and surgery and chief of podiatry at GWU, said that he was happy to hear the term dermatoporosis being used because like diabetes, it’s a risk factor for developing lower-extremity wounds and poor wound healing. 

He shared the case of an older woman with dermatoporosis who “tripped and skinned her knee against a step and then self-treated it for over a month by pouring hydrogen peroxide over it and letting air get to it.” The wound developed into “full-thickness tissue loss,” said Dr. Stempel, also medical director of the Wound Healing and Limb Preservation Center at GWU Hospital. 

Misperceptions are common among older patients about how a simple wound should be managed; for instance, the adage “just let it get air” is not uncommon. This makes anticipatory guidance about basic wound care — such as the importance of a moist and occlusive environment and the safe use of hydrogen peroxide — especially important for patients with dermatoporosis, Dr. Friedman commented after the meeting.



Dermatoporosis is quantifiable, Dr. Friedman said during the meeting, with a scoring system having been developed by the researchers in Switzerland who originally coined the term. Its use in practice is unnecessary, but its existence is “nice to share with patients who feel bothered because oftentimes, patients feel it’s been dismissed by other providers,” he said. “Telling your patients there’s an actual name for their problem, and that there are ways to quantify and measure changes over time, is validating.” 

Its recognition as a medical condition, Dr. Friedman added, also enables the dermatologist to bring it up and counsel appropriately — without a patient feeling shame — when it is identified in the context of a skin excision, treatment of a primary inflammatory skin disease, or management of another dermatologic problem.

Dr. Friedman disclosed that he is a consultant/advisory board member for L’Oréal, La Roche-Posay, Galderma, and other companies; a speaker for Regeneron/Sanofi, Incyte, BMD, and Janssen; and has grants from Pfizer, Lilly, Incyte, and other companies. Dr. Stempel reported no disclosures.

A version of this article first appeared on Medscape.com.

The chronic, excessive fragility of aging and sun-damaged skin has a name in the medical literature: dermatoporosis. This identification is helpful because it validates patients’ suffering and conveys the skin’s vulnerability to serious medical complications, said Adam Friedman, MD, at the ElderDerm conference on dermatology in the older patient.

Key features of dermatoporosis include atrophic skin, solar purpura, white pseudoscars, easily acquired skin lacerations and tears, bruises, and delayed healing. “We’re going to see more of this, and it will more and more be a chief complaint of patients,” said Dr. Friedman, professor and chair of dermatology at George Washington University (GWU) in Washington, and co-chair of the meeting. GWU hosted the conference, describing it as a first-of-its-kind meeting dedicated to improving dermatologic care for older adults.

Dr. Adam Friedman, professor and interim chief of dermatology, George Washington University, Washington
Dr. Adam Friedman


Dermatoporosis was described in the literature in 2007 by dermatologists at the University of Geneva in Switzerland. “It is not only a cosmetic problem,” Dr. Friedman said. “This is a medical problem ... which can absolutely lead to comorbidities [such as deep dissecting hematomas] that are a huge strain on the healthcare system.”

Dermatologists can meet the moment with holistic, creative combination treatment and counseling approaches aimed at improving the mechanical strength of skin and preventing potential complications in older patients, Dr. Friedman said at the meeting.

He described the case of a 76-year-old woman who presented with dermatoporosis on her arms involving pronounced skin atrophy, solar purpura, and a small covered laceration. “This was a patient who was both devastated by the appearance” and impacted by the pain and burden of dressing frequent wounds, said Dr. Friedman, who is also the director of the Residency Program, of Translational Research, and of Supportive Oncodermatology, all within the Department of Dermatology at GWU.

With 11 months of topical treatment that included daily application of calcipotriene 0.05% ointment and nightly application of tazarotene 0.045% lotion and oral supplementation with 1000-mg vitamin C twice daily and 1000-mg citrus bioflavonoid complex daily, as well as no changes to the medications she took for various comorbidities, the solar purpura improved significantly and “we made a huge difference in the integrity of her skin,” he said. 

Dr. Friedman also described this case in a recently published article in the Journal of Drugs in Dermatology titled “What’s Old Is New: An Emerging Focus on Dermatoporosis”. 
 

Likely Pathophysiology

Advancing age and chronic ultraviolet (UV) radiation exposure are the chief drivers of dermatoporosis. In addition to UVA and UVB light, other secondary drivers include genetic susceptibility, topical and systematic corticosteroid use, and anticoagulant treatment.

Its pathogenesis is not well described in the literature but is easy to envision, Dr. Friedman said. For one, both advancing age and exposure to UV light lead to a reduction in hygroscopic glycosaminoglycans, including hyaluronate (HA), and the impact of this diminishment is believed to go “beyond [the loss of] buoyancy,” he noted. Researchers have “been showing these are not just water-loving molecules, they also have some biologic properties” relating to keratinocyte production and epidermal turnover that appear to be intricately linked to the pathogenesis of dermatoporosis. 

HAs have been shown to interact with the cell surface receptor CD44 to stimulate keratinocyte proliferation, and low levels of CD44 have been reported in skin with dermatoporosis compared with a younger control population. (A newly characterized organelle, the hyaluronosome, serves as an HA factory and contains CD44 and heparin-binding epidermal growth factor, Dr. Friedman noted. Inadequate functioning may be involved in skin atrophy.) 

Advancing age also brings an increase in matrix metalloproteinases (MMPs)–1, –2, and –3, which are “the demolition workers of the skin,” and downregulation of a tissue inhibitor of MMPs, he said. 

Adding insult to injury, dermis-penetrating UVA also activates MMPs, “obliterating collagen and elastin.” UVB generates DNA photoproducts, including oxidative stress and damaging skin cell DNA. “That UV light induces breakdown [of the skin] through different mechanisms and inhibits buildup is a simple concept I think our patients can understand,” Dr. Friedman said.
 

 

 

Multifaceted Treatment

For an older adult, “there is never a wrong time to start sun-protective measures” to prevent or try to halt the progression of dermatoporosis, Dr. Friedman said, noting that “UV radiation is an immunosuppressant, so there are many good reasons to start” if the adult is not already taking measures on a regular basis.

Potential treatments for the syndrome of dermatoporosis are backed by few clinical studies, but dermatologists are skilled at translating the use of products from one disease state to another based on understandings of pathophysiology and mechanistic pathways, Dr. Friedman commented in an interview after the meeting. 

For instance, “from decades of research, we know what retinoids will do to the skin,” he said in the interview. “We know they will turn on collagen-1 and -3 genes in the skin, and that they will increase the production of glycosaminoglycans ... By understanding the biology, we can translate this to dermatoporosis.” These changes were demonstrated, for instance, in a small study of topical retinol in older adults.

Studies of topical alpha hydroxy acid (AHA), moreover, have demonstrated epidermal thickening and firmness, and “some studies show they can limit steroid-induced atrophy,” Dr. Friedman said at the meeting. “And things like lactic acid and urea are super accessible.”

Topical dehydroepiandrosterone is backed by even less data than retinoids or AHAs are, “but it’s still something to consider” as part of a multimechanistic approach to dermatoporosis, Dr. Friedman shared, noting that a small study demonstrated beneficial effects on epidermal atrophy in aging skin. 

The use of vitamin D analogues such as calcipotriene, which is approved for the treatment of psoriasis, may also be promising. “One concept is that [vitamin D analogues] increase calcium concentrations in the epidermis, and calcium is so central to keratinocyte differentiation” and epidermal function that calcipotriene in combination with topical steroid therapy has been shown to limit skin atrophy, he noted.

Nutritionally, low protein intake is a known problem in the older population and is associated with increased skin fragility and poorer healing. From a prevention and treatment standpoint, therefore, patients can be counseled to be attentive to their diets, Dr. Friedman said. Experts have recommended a higher protein intake for older adults than for younger adults; in 2013, an international group recommended a protein intake of 1-1.5 g/kg/d for healthy older adults and more for those with acute or chronic illness.

“Patients love talking about diet and skin disease ... and they love over-the-counter nutraceuticals as well because they want something natural,” Dr. Friedman said. “I like using bioflavonoids in combination with vitamin C, which can be effective especially for solar purpura.”

Actinic senile purpura, a common feature of dermatoporosis
Courtesy Dr. Adam Friedman
Actinic senile purpura, a common feature of dermatoporosis


A 6-week randomized, placebo-controlled, double-blind trial involving 67 patients with purpura associated with aging found a 50% reduction in purpura lesions among those took a particular citrus bioflavonoid blend twice daily. “I thought this was a pretty well-done study,” he said, noting that patient self-assessment and investigator global assessment were utilized.
 

 

 

Skin Injury and Wound Prevention

In addition to recommending gentle skin cleansers and daily moisturizing, dermatologists should talk to their older patients with dermatoporosis about their home environments. “What is it like? Is there furniture with sharp edges?” Dr. Friedman advised. If so, could they use sleeves or protectors on their arms or legs “to protect against injury?”

In a later meeting session about lower-extremity wounds on geriatric patients, Michael Stempel, DPM, assistant professor of medicine and surgery and chief of podiatry at GWU, said that he was happy to hear the term dermatoporosis being used because like diabetes, it’s a risk factor for developing lower-extremity wounds and poor wound healing. 

He shared the case of an older woman with dermatoporosis who “tripped and skinned her knee against a step and then self-treated it for over a month by pouring hydrogen peroxide over it and letting air get to it.” The wound developed into “full-thickness tissue loss,” said Dr. Stempel, also medical director of the Wound Healing and Limb Preservation Center at GWU Hospital. 

Misperceptions are common among older patients about how a simple wound should be managed; for instance, the adage “just let it get air” is not uncommon. This makes anticipatory guidance about basic wound care — such as the importance of a moist and occlusive environment and the safe use of hydrogen peroxide — especially important for patients with dermatoporosis, Dr. Friedman commented after the meeting.



Dermatoporosis is quantifiable, Dr. Friedman said during the meeting, using a scoring system developed by the researchers in Switzerland who originally coined the term. Its use in practice is unnecessary, but its existence is “nice to share with patients who feel bothered because oftentimes, patients feel it’s been dismissed by other providers,” he said. “Telling your patients there’s an actual name for their problem, and that there are ways to quantify and measure changes over time, is validating.”

Its recognition as a medical condition, Dr. Friedman added, also enables the dermatologist to bring it up and counsel appropriately — without a patient feeling shame — when it is identified in the context of a skin excision, treatment of a primary inflammatory skin disease, or management of another dermatologic problem.

Dr. Friedman disclosed that he is a consultant/advisory board member for L’Oréal, La Roche-Posay, Galderma, and other companies; a speaker for Regeneron/Sanofi, Incyte, BMD, and Janssen; and has grants from Pfizer, Lilly, Incyte, and other companies. Dr. Stempel reported no disclosures.

A version of this article first appeared on Medscape.com.

FROM ELDERDERM 2024

Managing Atopic Dermatitis in Older Adults: A Common, Unique Challenge

Article Type
Changed
Tue, 07/23/2024 - 11:11

WASHINGTON, DC — The onset of atopic dermatitis (AD) in older adulthood — even in adults aged ≥ 90 years — is a phenomenon documented in the literature in recent years, with reports showing age-related immune differences and differences in risk factors, Jonathan I. Silverberg, MD, PhD, MPH, said at the ElderDerm Conference on dermatology in the older patient hosted by the George Washington University School of Medicine and Health Sciences, Washington, DC. 

“I walked out of residency under the impression that if it didn’t start in the first year or two of life, it’s not AD,” said Dr. Silverberg, professor of dermatology and director of clinical research at George Washington University. “The numbers tell us a very different story.” 

Dr. Jonathan I. Silverberg, professor of dermatology at George Washington University, Washington, DC

The prevalence of AD in the United States fluctuates between 6% and 8% through adulthood, including age categories up to 81-85 years, according to 2012 National Health Interview Survey data. And while persistence of childhood-onset AD is common, a systematic review and meta-analysis published in 2018 concluded that one in four adults with AD report adult-onset disease. 

The investigators, including Dr. Silverberg, identified 25 observational studies — conducted across 16 countries and published during 1956-2017 — that analyzed age of onset beyond 10 years of age and met other inclusion criteria. Of the 25 studies, 17 reported age of onset after 16 years of age and had sufficient data for the meta-analysis. Using random-effects weighting, the investigators found a pooled proportion of adult-onset AD of 26.1% (95% CI, 16.5%-37.2%).

The research demonstrates that “the age of onset is distributed well throughout the lifespan,” Dr. Silverberg said, with the data “indicating there are many elderly-onset cases of true AD as well.” (Thirteen of the studies analyzed onset at age ≥ 65 years, and several looked beyond age 80.)

A 2021 study of a primary care database in the United Kingdom of 3.85 million children and adults found a “fascinating” bimodal distribution of incidence across the lifespan, with peaks in both infancy and older adulthood, he said. Incidence in adulthood was relatively stable from ages 18-49 years, after which, “into the 50s, 60s and beyond, you started to see a steady climb again.” 

Also intriguing, Dr. Silverberg continued, are findings from a study of outpatient healthcare utilization for AD in which he and his coinvestigator analyzed data from the National Ambulatory Medical Care Survey (NAMCS). In the article, published in 2023 and covering data from the 1993-2015 NAMCS, they reported that AD visits were most common among children aged 0-4 years (32.0%) and 5-9 years (10.6%), decreased in adolescents aged 10-19 years (11.6%), remained fairly steady in patients aged 20-89 years (1.0%-4.7%), and increased in patients aged > 90 years (20.7%).

“The peak usage for dermatologists, primary care physicians, etc., is happening in the first few years of life, partially because that’s when the disease is more common and more severe but also partially because that’s when parents and caregivers are first learning [about] the disease and trying to understand how to gain control,” Dr. Silverberg said at the meeting, presenting data from an expanded, unpublished analysis of NAMCS data showing these same outpatient utilization patterns. 

“It’s fascinating — there’s a much greater utilization in the elderly population. Why? The short answer is, we don’t know,” he said.

Risk Factors, Immune Differences

In a longitudinal study of two large birth cohorts from the United Kingdom, people with adult-onset AD were more likely to be women, to have smoked in adulthood, and to have had a lower childhood socioeconomic status than those whose AD started in childhood, Dr. Silverberg pointed out.

Patients with childhood-onset AD, meanwhile, were more likely to have asthma, allergen-specific immunoglobulin E (IgE), and known genetic polymorphisms previously associated with AD. (Each cohort — the 1958 British Cohort Study and the 1970 British Cohort Study — had more than 17,000 participants who were followed from birth through middle age.)

Data are limited, but “mechanistically,” AD in much older adults appears to have a unique serum cytokine pattern, Dr. Silverberg said. He pointed to a cross-sectional study in China of 1312 children and adults with AD in which researchers analyzed clinical features, serum samples, and skin biopsy samples.

Adults aged > 60 years showed more lesions on the trunk and extensor sites of the extremities and lower levels of serum IgE and peripheral eosinophil counts than those in younger age groups. And “interestingly,” compared with healthy controls, older patients with AD had “higher levels of serum expression of a variety of cytokines, including IL [interleukin]-4 but also high TARC levels ... and a variety of cytokines related to the Th17, TH1 axes, etc.,” he said. 

“So, we’re seeing a fascinating new profile that may be a little different than younger-onset cases,” he said, noting that TARC (thymus and activation-regulated chemokine) is regarded as a “decent biomarker” for AD.

In addition to higher levels of IL-4 and TARC, the study investigators reported significantly higher levels of IL-17A, IL-6, IL-22, IL-33, and thymic stromal lymphopoietin in older patients, compared with healthy controls.

Research also suggests that air pollution may play a role in the onset of AD in older age, Dr. Silverberg said, referencing a 2023 study that explored the association of air pollution and genetic risk with the onset of AD after age 50. The study analyzed 337,910 participants from the UK Biobank, with a median 12-year follow-up. Genetic risk was assessed as low, intermediate, or high based on tertiles of polygenic risk scores. Exposure to various air pollutants was assessed using a weighted quantile sum index and also categorized into tertiles.

The incidence of older adult-onset AD was associated with medium and high air pollution compared with low air pollution, with hazard ratios (HRs) of 1.182 (P = .003) and 1.359 (P < .001), respectively. And “to a lesser extent,” Dr. Silverberg said, incidence was associated with medium and high genetic susceptibility, with HRs of 1.065 (P = .249) and 1.153 (P = .008).

The researchers calculated a greater population-attributable fraction of air pollution (15.5%) than genetic risk (6.4%). “This means that yes, genetics can contribute even to later-onset disease ... but environment may play an even more important role,” Dr. Silverberg said.

In the Clinic

In all patients, and especially in older adults, sleep disturbance associated with AD is a consideration for care. Data collected at the eczema clinic of Northwestern University, Chicago, Illinois, between 2014 and 2019 through previsit, self-administered questionnaires show that patients ≥ 65 years of age have more profound sleep disturbance (especially trouble staying asleep) than patients aged 18-64 years, despite having similar AD severity, said Dr. Silverberg, a coinvestigator of the study.

Older age was associated with having an increased number of nights of sleep disturbance (3-7 nights in the previous week) because of eczema (adjusted odds ratio [aOR], 2.14; 95% CI, 1.16-3.92). It was also associated with itching-attributed delays in falling asleep and nighttime awakenings in the prior 2 weeks (aOR, 1.88; 95% CI, 1.05-3.39). 

“The aging population has dysregulated sleep patterns and altered circadian rhythms, so some of this is just natural predisposition,” Dr. Silverberg said. “But it’s amplified [with AD and itching], and it becomes a big clinical problem when we get into treatment because it’s our natural inclination to prescribe antihistamines for their sedative properties.”

Antihistamines can cause more profound sedation, more forgetfulness, and more anticholinergic side effects, he said, noting that “there’s some evidence that high-dose antihistamines may exacerbate dementia.”

Medication side effects and medication interactions, comorbidities, and decreased renal and hepatic clearance all can complicate treatment of AD in older adults. So can mobility, the extent of social/caregiving support, and other aspects of aging. For example, “I’m a big fan of ‘soak and smears’ ... but you have to ask, can you get out of a bathtub safely?” Dr. Silverberg said. “And you have to ask, can you reach the areas you need to [in order] to apply topicals?”

With oral Janus kinase inhibitors and other systemic medications, as with other drugs, “our older population is the most vulnerable from a safety perspective,” he said. A recently published post hoc analysis of four randomized trials of dupilumab in adults ≥ 60 years of age with moderate to severe AD demonstrated efficacy comparable with that in younger patients and “a really clean safety profile,” said Dr. Silverberg, the lead author. “We really need more of these types of post hocs to have some relative contextualization” for older adults.

Dr. Silverberg reported being a speaker for AbbVie, Eli Lilly, Leo Pharma, Pfizer, Regeneron, and Sanofi-Genzyme; a consultant and/or advisory board member for Regeneron, Sanofi-Genzyme, and other companies; and an investigator for several companies.

A version of this article first appeared on Medscape.com.
