Early Knee Osteoarthritis: Exercise Therapy’s Golden Window

TOPLINE:

People with knee osteoarthritis and symptoms for less than 1 year benefit more from exercise therapy than do those with longer symptom duration, especially when long-term outcomes are considered.

METHODOLOGY:

  • Researchers conducted an individual participant data meta-analysis using data from the OA Trial Bank, including 1769 participants (mean age, 65.1 years; 66% women) with knee osteoarthritis from 10 randomized controlled trials.
  • The participants were categorized on the basis of their symptom duration: ≤ 1 year, > 1 and ≤ 2 years, and > 2 years.
  • This study included an exercise therapy group comprising land- and water-based therapeutic exercise interventions and a control group comprising no exercise or sham treatment.
  • The primary outcomes were self-reported pain and physical function, standardized to a 0-100 scale, at short-term (closest to 3 months) and long-term (closest to 12 months) follow-ups.

TAKEAWAY:

  • The overall pain and physical function associated with osteoarthritis improved in the exercise therapy group at both short- and long-term follow-ups compared with the control group.
  • Exercise therapy led to a greater improvement in short-term (mean difference [MD], −3.57; P = .028) and long-term (MD, −8.33; P < .001) pain among participants with a symptom duration ≤ 1 year vs > 1 year.
  • Similarly, those with a symptom duration ≤ 2 years vs > 2 years who underwent exercise therapy showed greater benefits in terms of short-term (P = .001) and long-term (P < .001) pain.
  • Exercise therapy improved long-term physical function in those with a symptom duration ≤ 1 year vs > 1 year (MD, −5.46; P = .005) and ≤ 2 years vs > 2 years (MD, −4.56; P = .001).

IN PRACTICE:

“Exercise should be encouraged as early as possible once symptoms emerge in the disease process to take advantage of its effects in potentially [slowing] disease progression within the suggested ‘window of opportunity,’ ” the authors wrote.

SOURCE:

The study was led by Marienke van Middelkoop, PhD, Erasmus MC Medical University, Rotterdam, the Netherlands. It was published online in Osteoarthritis and Cartilage.

LIMITATIONS:

The dataset of most studies included in the meta-analysis lacked information on the radiographic severity of osteoarthritis. The relatively short follow-up time hindered interpreting the impact of exercise on the long-term progression of osteoarthritis. The reliance on patient recall for recording symptom duration may have led to misclassification.

DISCLOSURES:

The Netherlands Organisation for Health Research and Development supported this study. Three authors received funding from the Dutch Arthritis Society for the program grant Center of Excellence “OA prevention and early treatment – OA Pearl.” One author declared receiving royalties for the UpToDate knee osteoarthritis clinical guidelines.

This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.

Physician-Scientist Taps into Microbiome to Fight Cancer

The lowest point in the nascent career of Neelendu Dey, MD, helped seal his fate as a physician-scientist.

He had just started his first year as a resident at the University of California, San Francisco. One of his patients was a 30-year-old woman who was dying of metastatic colorectal cancer. “I was in my mid-20s interacting with an individual just a few years older than I am, going through one of the most terrible health outcomes one could imagine,” Dr. Dey said.

He remembers asking the patient what he could do for her, how he could make her feel more comfortable. “That feeling of helplessness, particularly as we think about young people developing cancer, it really stuck with me through the years,” he said.

Dr. Dey, a graduate of the AGA Future Leaders Program, is now a gastroenterologist and researcher at the Fred Hutch Cancer Center in Seattle. In an interview, he talked about his dual role as a physician and scientist, and how those two interests are guiding his research in precancerous conditions of the colon.

Cases like that of the young woman with colon cancer “really help drive the urgency of the work we do, and the research questions we ask, as we try to move the ball forward and help folks at earlier stages,” he said.

Dr. Neelendu Dey, Fred Hutch Cancer Center, Seattle, Washington

Q: Why did you choose GI?

When you think about what sorts of chronic diseases really impact your quality of life, gut health is one of the chief contributors among various aspects of health. And that really appealed to me — the ability to take someone who is essentially handicapped by a series of illnesses and symptoms that derive from the GI tract and enable them to return to the person they want to be, to be productive in the way that they want to be, and have a rewarding life.

As I thought about how I wanted to contribute to the future of medicine, one of the ways in which I’ve always thought that I would do that is through research. When I considered the fields that really appealed to me, both from that clinical standpoint and research standpoint, GI was one that really stood out. There has been a lot of exciting research going on in GI. My lab currently studies the microbiome, and I feel like this is an area in which we can contribute.
 

Q: What role does digestive health play in overall health?

Obviously, the direct answer is gut health is so critical in something like nutritional intake. Some GI symptoms, if your gut health has gone awry, can really be detrimental in terms of quality of life. But one less obvious role that digestive health plays is its long-term effects. We’re starting to appreciate that gut health, the gut microbiome, and gut immune education are probably long-term players. Some experiences in early life might shape our immunity in ways that have consequences for us much later in life. Whether we get early life antibiotics, for example, may potentially contribute to colorectal cancer down the line. Thinking about the long-term players is more challenging, but it’s also an appealing opportunity as we think about how we can shape medicine moving forward.

 

 

Q: What practice challenges have you faced in your career?

First, being a physician-scientist. It’s challenging to be either a physician alone or to be a researcher alone. And trying to do both includes the challenges of both individual worlds. It just takes more time to get all the prerequisite training. And second, there are just challenges with getting the opportunities to contribute in the ways that you want — to get the research funding, to get the papers out, things like that.

Q: Tell me about the work you’ve been doing in your lab to develop microbiome-based strategies for preventing and treating cancer.

The microbiome presents several opportunities when it comes to cancer prevention. One is identifying markers of cancer risk, or of general good health down the line. Some of those biomarkers could — potentially — feed directly into personalized risk assessment and maybe even inform a future screening strategy. The second opportunity the microbiome presents is if we identify a microbe that influences your cancer risk, can we then understand and exploit, or utilize, that mechanism to mitigate cancer risk in the future? Our lab has done work looking at subspecies levels of microbes that track with health or cancer. We’ve done some work to identify what these subspecies groupings are and have identified some links to certain precancerous changes in the colon. We think that there’s an opportunity here for future interventions.

Q: Have you published other papers?

We recently published another paper describing how some microbes can interact with a tumor suppressor gene and are influenced in a sex-biased manner to drive tumorigenesis in a mouse model. We think, based on what we’re seeing in human data, that there may be some relationships and we’re exploring that now as well. 

Q: What is your vision for the future in GI, and in your career?

The vision that I have is to create clinical tools that can expand our reach and our effectiveness in cancer prevention. I think that there are opportunities for leveraging microbiome research to accomplish this. And one outcome I could imagine is leveraging some of these insights to expand noninvasive screening at even earlier ages than we do now. I mean, we just dialed back the recommended age for colonoscopy for average-risk individuals to 45. But I could envision a future in which noninvasive screening starts earlier, in which the first stool-based tests that we deploy to assess personalized risk are used in the pediatric clinic.

Lightning Round

Texting or talking?

Talking

Favorite city in the United States besides the one you live in?

St. Louis

Cat or dog person?

Both

If you weren’t a GI, what would you be?

Musician

Best place you went on vacation?

Borneo

Favorite sport?

Soccer

Favorite ice cream?

Cashew-based salted caramel

What song do you have to sing along with when you hear it?

Sweet Child o’ Mine

Favorite movie or TV show?

25th Hour or Shawshank Redemption

Optimist or Pessimist?

Optimist

Breakthrough Blood Test for Colorectal Cancer Gets Green Light

A breakthrough in medical testing now allows for colorectal cancer screening with just a simple blood test, promising a more accessible and less invasive way to catch the disease early. 

The FDA on July 29 approved the test, called Shield, which can accurately detect tumors in the colon or rectum about 87% of the time when the cancer is in treatable early stages. The approval was announced the same day by the test’s maker, Guardant Health, and comes just months after promising clinical trial results were published in The New England Journal of Medicine.

Colorectal cancer is among the most common cancers diagnosed in the United States each year and is one of the leading causes of cancer deaths. The condition is treatable in early stages, but about 1 in 3 people don’t stay up to date on regular screenings, which should begin at age 45.

The simplicity of a blood test could make it more likely that people will be screened for, and ultimately survive, the disease. Other primary screening options include stool-based tests or colonoscopy. The 5-year survival rate for colorectal cancer is 64%.

While highly accurate at detecting DNA shed by tumors during treatable stages of colorectal cancer, the Shield test was not as effective at detecting precancerous areas of tissue, which are typically removed after being detected.

In its news release, Guardant Health officials said they anticipate that the test will be covered under Medicare. The out-of-pocket cost for people whose insurance does not cover the test has not yet been announced. The test is expected to be available by next week, The New York Times reported.

If someone’s Shield test comes back positive, the person would then get more tests to confirm the result. Shield was shown in trials to have a 10% false positive rate.

“I was in for a routine physical, and my doctor asked when I had my last colonoscopy,” said John Gormly, a 77-year-old business executive in Newport Beach, California, according to a Guardant Health news release. “I said it’s been a long time, so he offered to give me the Shield blood test. A few days later, the result came back positive, so he referred me for a colonoscopy. It turned out I had stage II colon cancer. The tumor was removed, and I recovered very quickly. Thank God I had taken that blood test.”
 

A version of this article appeared on WebMD.com.

Two Diets Linked to Improved Cognition, Slowed Brain Aging

An intermittent fasting (IF) diet and a standard healthy living (HL) diet focused on healthy foods both led to weight loss, reduced insulin resistance (IR), and slowed brain aging in older overweight adults with IR, new research showed. However, neither diet had an effect on Alzheimer’s disease (AD) biomarkers.

Although investigators found both diets were beneficial, some outcomes were more robust with the IF diet.

“The study provides a blueprint for assessing brain effects of dietary interventions and motivates further research on intermittent fasting and continuous diets for brain health optimization,” wrote the investigators, led by Dimitrios Kapogiannis, MD, chief, human neuroscience section, National Institute on Aging, and adjunct associate professor of neurology, the Johns Hopkins University School of Medicine.

The findings were published online in Cell Metabolism.
 

Cognitive Outcomes

The prevalence of IR — reduced cellular sensitivity to insulin that’s a hallmark of type 2 diabetes — increases with age and obesity, adding to an increased risk for accelerated brain aging as well as AD and related dementias (ADRD) in older adults who have overweight.

Studies reported healthy diets promote overall health, but it’s unclear whether, and to what extent, they improve brain health beyond general health enhancement.

Researchers used multiple brain and cognitive measures to assess dietary effects on brain health, including peripherally harvested neuron-derived extracellular vesicles (NDEVs) to probe neuronal insulin signaling; MRI to investigate the pace of brain aging; magnetic resonance spectroscopy (MRS) to measure brain glucose, metabolites, and neurotransmitters; and NDEVs and cerebrospinal fluid to derive biomarkers for AD/ADRD.

The study included 40 cognitively intact overweight participants with IR, mean age 63.2 years, 60% women, and 62.5% White. Their mean body weight was 97.1 kg and mean body mass index (BMI) was 34.4.

Participants were randomly assigned to 8 weeks of an IF diet or an HL diet that emphasizes fruits, vegetables, whole grains, lean proteins, and low-fat dairy and limits added sugars, saturated fats, and sodium.

The IF diet involved following the HL diet for 5 days per week and restricting calories to a quarter of the recommended daily intake for 2 consecutive days.

Both diets reduced neuronal IR and had comparable effects in improving insulin signaling biomarkers in NDEVs, reducing brain glucose on MRS, and improving blood biomarkers of carbohydrate and lipid metabolism.

Using MRI, researchers also assessed brain age, an indication of whether the brain appears older or younger than an individual’s chronological age. There was a decrease of 2.63 years with the IF diet (P = .05) and 2.42 years with the HL diet (P < .001) in the anterior cingulate and ventromedial prefrontal cortex.

Both diets improved executive function and memory, with those following the IF diet benefiting more in strategic planning, switching between two cognitively demanding tasks, cued recall, and other areas.
 

Hypothesis-Generating Research

AD biomarkers, including amyloid beta 42 (Aβ42), Aβ40, and plasma phosphorylated-tau181, did not change with either diet, a finding that investigators speculated may be due to the short duration of the study. Light-chain neurofilaments increased across groups with no differences between the diets.

In other findings, BMI decreased by 1.41 with the IF diet and by 0.80 with the HL diet, and a similar pattern was observed for weight. Waist circumference decreased in both groups with no significant differences between diets.

An exploratory analysis showed executive function improved with the IF diet but not with the HL diet in women, whereas it improved with both diets in men. BMI and apolipoprotein E and SLC16A7 genotypes also modulated diet effects.

Both diets were well tolerated. The most frequent adverse events were gastrointestinal and occurred only with the IF diet.

The authors noted the findings are preliminary and hypothesis generating. Study limitations included the short duration and limited power to detect anything other than moderate to large effects and differences between the diets. Researchers also didn’t acquire data on dietary intake, so lapses in adherence can’t be excluded. However, the large decreases in BMI, weight, and waist circumference with both diets indicated high adherence.

The study was supported by the National Institutes of Health’s National Institute on Aging. The authors reported no competing interests.
 

A version of this article first appeared on Medscape.com.

Heat Waves: A Silent Threat to Older Adults’ Kidneys

TOPLINE:

Older adults show an increase in creatinine and cystatin C levels after exposure to extreme heat in a dry setting despite staying hydrated; however, changes in these kidney function biomarkers are much more modest in a humid setting and in young adults.

METHODOLOGY:

  • Older adults are vulnerable to heat-related morbidity and mortality, with kidney complications accounting for many excess hospital admissions during heat waves.
  • Researchers investigated plasma-based markers of kidney function following extreme heat exposure for 3 hours in 20 young (21-39 years) and 18 older (65-76 years) adults recruited from the Dallas-Fort Worth area.
  • All participants underwent heat exposure in a chamber at 47 °C (116 °F) and 15% relative humidity (dry setting) and 41 °C (105 °F) and 40% relative humidity (humid setting) on separate days. They performed light physical activity mimicking their daily tasks and drank 3 mL/kg body mass of water every hour while exposed to heat.
  • Blood samples were collected at baseline, immediately before the end of heat exposure (end-heating), and 2 hours after heat exposure.
  • Plasma creatinine was the primary outcome, with a change ≥ 0.3 mg/dL considered clinically meaningful. Cystatin C was the secondary outcome.

TAKEAWAY:

  • The plasma creatinine level showed a modest increase from baseline to end-heating (difference, 0.10 mg/dL; P = .004) and at 2 hours post exposure (difference, 0.17 mg/dL; P < .001) in older adults facing heat exposure in the dry setting.
  • The mean cystatin C levels also increased from baseline to end-heating by 0.29 mg/L (P = .01) and at 2 hours post heat exposure by 0.28 mg/L (P = .004) in older adults in the dry setting.
  • The mean creatinine levels increased by only 0.06 mg/dL (P = .01) from baseline to 2 hours post exposure in older adults facing heat exposure in the humid setting.
  • Young adults didn’t show any significant change in the plasma cystatin C levels during or after heat exposure; however, there was a modest increase in the plasma creatinine levels after 2 hours of heat exposure (difference, 0.06 mg/dL; P = .004).

IN PRACTICE:

“These findings provide limited evidence that the heightened thermal strain in older adults during extreme heat may contribute to reduced kidney function,” the authors wrote. 

SOURCE:

The study was led by Zachary J. McKenna, PhD, from the Department of Internal Medicine, University of Texas Southwestern Medical Center, Dallas, Texas, and was published online in JAMA.

LIMITATIONS:

The use of plasma-based markers of kidney function, a short laboratory-based exposure, and a small number of generally healthy participants were the main limitations that could affect the generalizability of this study’s findings to broader populations and real-world settings. 

DISCLOSURES:

The National Institutes of Health and American Heart Association funded this study. Two authors declared receiving grants and nonfinancial support from several sources. 

This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article appeared on Medscape.com.


Risk Stratification May Work Well for FIT-Based CRC Screening in Elderly

Article Type
Changed
Wed, 08/07/2024 - 14:59

A risk-stratified upper age limit may be beneficial for colorectal cancer (CRC) screening among patients who are ages 75 and older, according to a study presented at the annual Digestive Disease Week® (DDW).

In particular, interval CRC risk can vary substantially based on the fecal hemoglobin (f-Hb) concentration in the patient’s last fecal immunochemical test (FIT), as well as the number of prior screening rounds.

“Less is known about what happens after the upper age limit has been reached and individuals are not invited to participate in more screening rounds. This is important as life expectancy is increasing, and it is increasingly important to consider the most efficient way of screening the elderly,” said lead author Brenda van Stigt, a PhD candidate focused on cancer screening at Erasmus University Medical Center in Rotterdam, the Netherlands.

In the Netherlands, adults between ages 55 and 75 are invited to participate in stool-based CRC screening every 2 years. Those whose FIT result meets or exceeds the positivity threshold of 47 μg Hb/g are referred for colonoscopy, and those who test negative are invited to participate again after 2 years.

FIT can play a major role in risk stratification, Ms. van Stigt noted, along with other factors that influence CRC risk, such as age, sex, and CRC screening history. Although this is documented for ages 55-75, she and colleagues wanted to know more about what happens after age 75.

Ms. Van Stigt and colleagues conducted a population-based study by analyzing Dutch national cancer registry data and FIT results around the final screening at age 75, looking at those who were diagnosed with CRC within 24 months of their last negative FIT. The researchers assessed interval CRC risk and cancer stage, accounting for sex, last f-Hb concentration, and the number of screening rounds.

Among 305,761 people with a complete 24-month follow-up after a negative FIT, 661 patients were diagnosed with interval CRC, indicating an overall interval CRC risk of 21.6 per 10,000 individuals with a negative FIT. There were no significant differences by sex.

However, there were differences by screening rounds, with those who had participated in three or four screening rounds having a lower risk than those who participated only once (HR, 0.49).

In addition, those with detectable f-Hb (>0 μg Hb/g) in their last screening round had a much higher interval CRC risk (HR, 4.87), at 65.8 per 10,000 negative FITs, compared with 13.8 per 10,000 among those without detectable f-Hb. Interval CRC risk also increased over time for those with detectable f-Hb.
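
The headline figures can be reproduced directly from the counts reported above. The short sketch below is illustrative only, not the study's analysis code: it re-derives the overall rate per 10,000 negative FITs and shows that the crude ratio of the two f-Hb subgroup rates sits close to the reported hazard ratio, which was estimated from a survival model rather than a simple ratio of proportions.

```python
# Back-of-envelope check of the reported interval CRC rates (illustrative, not study code).
interval_crc_cases = 661        # interval CRCs diagnosed within 24 months of a negative FIT
negative_fits = 305_761         # participants with complete 24-month follow-up

overall_rate = interval_crc_cases / negative_fits * 10_000
print(f"Overall interval CRC risk: {overall_rate:.1f} per 10,000 negative FITs")  # ~21.6

# Crude ratio of the reported subgroup rates; it lands near the reported HR of 4.87.
rate_detectable_fhb = 65.8      # per 10,000 negative FITs, detectable f-Hb (> 0 ug Hb/g)
rate_undetectable_fhb = 13.8    # per 10,000 negative FITs, no detectable f-Hb
print(f"Crude rate ratio: {rate_detectable_fhb / rate_undetectable_fhb:.2f}")     # ~4.77
```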

About 15% of the total population had detectable f-Hb, whereas 46% of those with interval CRC had detectable f-Hb, Ms. van Stigt said, meaning that nearly half of patients who were diagnosed with interval CRC already had detectable f-Hb in their prior FIT.

In a survival analysis, there was no association between interval CRC risk and sex. However, those who participated in three or four screening rounds were half as likely to be diagnosed as those who participated once or twice, and those with detectable f-Hb were five times as likely to be diagnosed.

For late-stage CRC, there was no association with sex or the number of screening rounds. Detectable f-Hb was associated with not only a higher risk of interval CRC but also a late-stage diagnosis.

“These findings indicate that one uniform age to stop screening is suboptimal,” Ms. van Stigt said. “Personalized screening strategies should, therefore, also ideally incorporate a risk-stratified age to stop screening.”

The US Preventive Services Task Force recommends that clinicians personalize screening for ages 76-85, accounting for overall health, prior screening history, and patient preferences.

“But we have no clear guidance on how to quantify or weigh these factors. This interesting study highlights how one of these factors (prior screening history) and fecal hemoglobin level (an emerging factor) are powerful stratifiers of subsequent colorectal cancer risk,” said Sameer D. Saini, MD, AGAF, director and research investigator at the VA Ann Arbor Healthcare System’s Center for Clinical Management Research. Dr. Saini wasn’t involved with the study.


At the clinical level, Dr. Saini said, sophisticated modeling is needed to understand the interaction with competing risks and identify the optimal screening strategies for patients at varying levels of cancer risk and life expectancy. Models could also help to quantify the population benefits and cost-effectiveness of personalized screening.

“Finally, it is important to note that, in many health systems, access to quantitative FIT may be limited,” he said. “These data may be less informative if colonoscopy is the primary mode of screening.”

Ms. van Stigt and Dr. Saini reported no relevant disclosures.


Statins, Vitamin D, and Exercise in Older Adults

Article Type
Changed
Mon, 07/29/2024 - 15:09

In this article, I will review several recently published articles and guidelines relevant to the care of older adults in primary care. The articles of interest address statins for primary prevention, vitamin D supplementation and testing, and physical activity for healthy aging.
 

Statins for Primary Prevention of Cardiovascular Disease

A common conundrum in primary care is whether an older adult should be on a statin for primary prevention. This question has been difficult to answer because of the underrepresentation of older adults in clinical trials that examine the effect of statins for primary prevention. A recent study by Xu et al. published in Annals of Internal Medicine sought to address this gap in knowledge, investigating the risks and benefits of statins for primary prevention for older adults.1

This study stratified participants into “old” (aged 75-84 years) and “very old” (85 years or older) groups. Older adults with an indication for statins who initiated statin therapy were followed over a 5-year period and compared with age-matched cohorts who did not initiate statins. Participants with known cardiovascular disease at baseline were excluded. The outcomes of interest were major cardiovascular disease (CVD), a composite of myocardial infarction, stroke, or heart failure; all-cause mortality; and adverse effects of drug therapy (myopathy or liver dysfunction).

The study found that among older adults aged 75-84, initiation of statin therapy led to a 1.2% risk reduction in major CVD over a 5-year period. For older adults aged 85 and greater, initiation of statins had an even larger impact, leading to a 4.4% risk reduction in major CVD over a 5-year period. The study found that there was no significant difference in adverse effects including myopathy or liver dysfunction in both age groups.
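
For context, if the reported figures are read as absolute differences in 5-year risk between those who did and did not initiate statins (which is how they are presented above), they imply rough numbers needed to treat. The calculation below is a back-of-envelope illustration, not a result reported by Xu et al.

```python
# Implied 5-year numbers needed to treat (NNT), assuming the reported risk reductions
# are absolute differences between the statin and non-statin groups; illustrative only.
absolute_risk_reduction = {"75-84 years": 0.012, "85 years or older": 0.044}

for age_group, arr in absolute_risk_reduction.items():
    print(f"{age_group}: NNT over 5 years ~ {1 / arr:.0f}")  # ~83 and ~23, respectively
```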

Statins, the study suggests, are appropriate and safe to initiate for primary prevention in older adults and can lead to substantial benefits in reduction of CVD. While time to benefit was not explicitly examined in this study, a prior study by Yourman et al. suggested that the time to benefit for statins for primary prevention in adults aged 50-75 would be at least 2.5 years.2

My takeaway from these findings is to discuss statin initiation for primary prevention for older patients who are focused on longevity, have good functional status (often used in geriatrics as a proxy for prognosis), and are willing to accept more medications.
 

Empiric Vitamin D Supplementation in Adults over 75 Years

Vitamin D is one of the most common supplements taken by older adults, but the evidence supporting supplementation in the published literature is mixed, as most data come from observational trials. New guidelines from the Endocrine Society focused on developing recommendations for healthy individuals, drawing on randomized controlled trials (RCTs) and, where RCTs were not available, large longitudinal observational trials with comparison groups. These guidelines recommend against empiric supplementation of vitamin D for healthy adults aged 18-74, excluding pregnant women and patients with high-risk diabetes.3

For older adults aged 75 or greater, empiric vitamin D supplementation is recommended because of the possible reduction of risk in all-cause mortality in this population. Of note, this was a grade 2 recommendation by the panel, indicating that the benefits of the treatment probably outweigh the risks. The panel stated that vitamin D supplementation could be delivered through fortified foods, multivitamins with vitamin D, or as a separate vitamin D supplement.

The dosage should remain within the Institute of Medicine’s recommended dietary allowance of 800 IU daily for adults older than 70 years, and the panel recommends low-dose daily vitamin D supplementation over high-dose intermittent supplementation. The panel noted that routine screening of vitamin D levels should not be used to guide decisions about starting supplementation, but vitamin D levels should be obtained for patients who have an indication for evaluation.

The reviewers highlight that these guidelines were developed for healthy individuals and are not applicable to those with conditions that warrant vitamin D evaluation. In my clinical practice, many of my patients have bone-mineral conditions and cognitive impairment that warrant evaluation. Based on these guidelines, I will consider empiric vitamin D supplementation more often for healthy patients aged 75 and older.
 

 

 

Sedentary Behaviors and Healthy Aging

Engaging inactive older adults in regular physical activity can be challenging, particularly as the pandemic has led to more pervasive social isolation and affected the availability of in-person exercise activities in the community. Physical activity is a key component of healthy aging and cognition, and its benefits should be a part of routine counseling for older adults.

An interesting recent study published in JAMA Network Open by Shi et al. evaluated the association of health behaviors and aging in female US nurses over a 20-year period.4 Surveys were administered to capture time spent in each behavior, such as being sedentary (TV watching, sitting at home or at work), light activity (walking around the house or at work), and moderate to vigorous activity (walking for exercise, lawn mowing). “Healthy aging” was defined by the absence of chronic conditions such as heart failure, and lack of physical, mental, and cognitive impairment.

The study found that participants who were more sedentary were less likely to age healthfully, with each additional 2 hours of TV watching per day associated with a 12% reduction in likelihood of healthy aging. Light physical activity was associated with a significant increase in healthy aging, with a 6% increase in the likelihood of healthy aging for each additional 2 hours of light activity. Each additional 1 hour of moderate to vigorous activity was associated with a 14% increase in the likelihood of healthy aging. These findings support discussions with patients that behavior change, even in small increments, can be beneficial in healthy aging.
 

References

1. Xu W et al. Ann Intern Med. 2024 Jun;177(6):701-10.

2. Yourman LC et al. JAMA Intern Med. 2021;181:179-85.

3. Demay MB et al. J Clin Endocrinol Metab. August 2024;109(8):1907-47.

4. Shi H et al. JAMA Netw Open. 2024;7(6):e2416300.


Atogepant May Prevent Rebound Headache From Medication Overuse in Chronic Migraine

Article Type
Changed
Mon, 07/29/2024 - 15:15

The oral calcitonin gene-related peptide receptor antagonist atogepant is effective in preventing rebound headache related to medication overuse in patients with chronic migraine (CM), new research suggested.

Results of a subgroup analysis of a phase 3, 12-week randomized, double-blind, placebo-controlled trial showed up to a 62% reduction in the proportion of atogepant-treated participants who met acute medication overuse criteria.

“Based on our findings, treatment with atogepant may potentially decrease the risk of developing rebound headache by reducing the use of pain medications,” principal investigator Peter Goadsby, MD, PhD, of King’s College London, London, England, said in a news release.

The study was published online in Neurology.
 

Effective Prevention Needed

Acute treatments for migraine can mitigate symptoms and reduce disability but can also be ineffective and even result in increased dosing and overuse of these medications, the investigators noted.

Acute medication overuse is defined as “taking simple analgesics for ≥ 15 days per month or taking triptans, ergots, opioids, or combinations of medications for ≥ 10 days per month.”
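
To make the definition concrete, the sketch below encodes the two thresholds as a simple check on monthly medication-day counts. The function and its parameter names are hypothetical, introduced only for illustration; they are not part of the study protocol or of any existing headache-diary software.

```python
# Hypothetical helper that mirrors the acute medication overuse thresholds quoted above;
# it is not part of the PROGRESS trial methodology or any existing diary software.
def meets_acute_medication_overuse(simple_analgesic_days: int,
                                   specific_or_combination_days: int) -> bool:
    """Classify one month of diary data against the overuse definition:
    >= 15 days of simple analgesics, or >= 10 days of triptans, ergots,
    opioids, or combination medications."""
    return simple_analgesic_days >= 15 or specific_or_combination_days >= 10

# Example: 8 days of ibuprofen and 11 days of a triptan in one month meets the definition.
print(meets_acute_medication_overuse(8, 11))  # True
```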

“There is a high prevalence of pain medication overuse among people with migraine as they try to manage what are often debilitating symptoms,” Dr. Goadsby said. “However, medication overuse can lead to more headaches, called rebound headaches, so more effective preventive treatments are needed.”

Atogepant was developed for migraine prevention in adults. It had been studied in the phase 3 PROGRESS trial, which showed it significantly reduced monthly migraine days (MMDs) compared with placebo during the 12-week trial.

The new subgroup analysis of the study focused specifically on the efficacy and safety of atogepant vs placebo in participants with CM with, and without, medication overuse.

Participants (mean age, 42.1 years; 87.6% women) were randomized to receive either atogepant 30 mg twice daily (n = 253), atogepant 60 mg once daily (n = 256), or placebo (n = 240), with baseline demographics and clinical characteristics similar across all treatment arms. A total of 66.2% met baseline acute medication overuse criteria.

Participants were asked to record migraine and headache experiences in an electronic diary.
 

‘Effective and Safe’

Participants in both atogepant groups experienced fewer monthly migraine days (MMDs) than those in the placebo group, with a least squares mean difference (LSMD) of −2.7 (95% confidence interval [CI], −4.0 to −1.4) in the atogepant 30 mg twice daily group and −1.9 (95% CI, −3.2 to −0.6) in the atogepant 60 mg once daily group.

Monthly headache days (MHDs) were also reduced in both treatment groups, with LSMDs of −2.8 (95% CI, −4.0 to −1.5) and −2.1 (95% CI, −3.3 to −0.8), respectively. Mean acute medication use days were lower in both treatment groups, with LSMDs of −2.8 (95% CI, −4.1 to −1.6) and −2.6 (95% CI, −3.9 to −1.3), respectively.

A higher proportion of participants achieved a ≥ 50% reduction in MMDs with atogepant 30 mg twice daily (odds ratio [OR], 2.5; 95% CI, 1.5-4.0) and atogepant 60 mg once daily (OR, 2.3; 95% CI, 1.4-3.7).

Notably, the researchers found a 52.1%-61.9% reduction in the proportion of atogepant-treated participants meeting acute medication overuse criteria during the study period, compared with a 38.3% reduction in the placebo group.

Similar results were observed in the subgroup without acute medication overuse.

Treatment-emergent adverse events were reported by 55.8% of participants treated with atogepant 30 mg twice daily, 66.1% with atogepant 60 mg once daily, and 48.5% with placebo in the acute medication overuse subgroup, with similar reports in the non-overuse subgroup.

A limitation cited by the authors was that participants’ self-report of migraines and headaches via electronic diaries might have been inaccurate.

Nevertheless, they concluded that the results showed atogepant to be an “effective and safe” preventive treatment for patients with CM with, and without, acute medication overuse.

AbbVie funded this study and participated in the study design, research, analysis, data collection, interpretation of data, reviewing, and approval of the publication. No honoraria or payments were made for authorship. Dr. Goadsby received personal fees from AbbVie during the conduct of the study, and over the last 36 months, he received a research grant from Celgene; personal fees from Aeon Biopharma, Amgen, CoolTechLLC, Dr. Reddy’s, Eli Lilly and Company, Epalex, Lundbeck, Novartis, Pfizer, Praxis, Sanofi, Satsuma, ShiraTronics, Teva Pharmaceuticals, and Tremeau; personal fees for advice through Gerson Lehrman Group, Guidepoint, SAI Med Partners, and Vector Metric; fees for educational materials from CME Outfitters; and publishing royalties or fees from Massachusetts Medical Society, Oxford University Press, UpToDate, and Wolters Kluwer. The other authors’ disclosures are listed on the original paper.

A version of this article first appeared on Medscape.com.


TBI Significantly Increases Mortality Rate Among Veterans With Epilepsy

Article Type
Changed
Thu, 07/18/2024 - 10:11

Veterans diagnosed with epilepsy have a significantly higher mortality rate if they experience a traumatic brain injury either before or within 6 months of an epilepsy diagnosis, according to recent research published in Epilepsia.

In a retrospective cohort study, Ali Roghani, PhD, of the division of epidemiology at the University of Utah School of Medicine in Salt Lake City, and colleagues evaluated 938,890 veterans between 2000 and 2019 in the Defense Health Agency and the Veterans Health Administration who served in the US military after the September 11 attacks. Overall, 27,436 veterans met criteria for a diagnosis of epilepsy, 264,890 had received a diagnosis for a traumatic brain injury (TBI), and the remaining patients had neither epilepsy nor TBI.

Among the veterans without epilepsy, 248,714 had a TBI diagnosis. In the group with epilepsy, 10,358 veterans experienced a TBI before their epilepsy diagnosis, 1598 were diagnosed with a TBI within 6 months of the epilepsy diagnosis, and 4310 had a TBI more than 6 months after the epilepsy diagnosis. The researchers assessed all-cause mortality in each group, calculating cumulative mortality rates compared with the group of veterans who had neither TBI nor epilepsy.
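
A quick check of the reported counts shows how much overlap there was between the two diagnoses; the snippet below is a reader’s arithmetic, not part of the study’s analysis.

```python
# Quick reader's check on the cohort breakdown reported above (not study code).
epilepsy_total = 27_436
tbi_before, tbi_within_6_months, tbi_after_6_months = 10_358, 1_598, 4_310

epilepsy_with_tbi = tbi_before + tbi_within_6_months + tbi_after_6_months
print(epilepsy_with_tbi)                                  # 16,266 veterans with both diagnoses
print(f"{epilepsy_with_tbi / epilepsy_total:.0%} of the epilepsy group also had a TBI")  # ~59%
```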

Dr. Roghani and colleagues found a significantly higher mortality rate among veterans who developed epilepsy than in the control group with neither epilepsy nor TBI (6.26% vs. 1.12%; P < .01); most of the veterans who died were White (67.4%) and men (89.9%). Compared with deceased veterans, nondeceased veterans were significantly more likely to have a history of deployment (70.7% vs. 64.8%; P < .001), less likely to have served in the Army (52.2% vs. 55.0%; P < .001), and more likely to have reached the rank of officer or warrant officer (8.1% vs. 7.6%; P = .014).

There were also significant differences in clinical characteristics between nondeceased and deceased veterans, including higher rates of substance use disorder, smoking history, cardiovascular disease, stroke, transient ischemic attack, cancer, liver disease, kidney disease, and other injuries, as well as overdose, suicidal ideation, and homelessness. “Most clinical conditions were significantly different between deceased and nondeceased in part due to the large cohort size,” the researchers said.

After performing Cox regression analyses, the researchers found a higher mortality risk among veterans with epilepsy, TBI, or both than among veterans with neither condition: risk was highest in those who developed a TBI within 6 months of an epilepsy diagnosis (hazard ratio [HR], 5.02; 95% CI, 4.21-5.99), followed by those with a TBI prior to epilepsy (HR, 4.25; 95% CI, 3.89-4.58), those with epilepsy alone (HR, 4.00; 95% CI, 3.67-4.36), those with a TBI more than 6 months after an epilepsy diagnosis (HR, 2.49; 95% CI, 2.17-2.85), and those with a TBI alone (HR, 1.30; 95% CI, 1.25-1.36).

“The temporal relationship with TBI that occurred within 6 months after epilepsy diagnosis may suggest an increased vulnerability to accidents, severe injuries, or TBI resulting from seizures, potentially elevating mortality risk,” Dr. Roghani and colleagues wrote.

The researchers said the results “raise concerns” about the subgroup of patients who are diagnosed with epilepsy close to experiencing a TBI.

“Our results provide information regarding the temporal relationship between epilepsy and TBI regarding mortality in a cohort of post-9/11 veterans, which highlights the need for enhanced primary prevention, such as more access to health care among people with epilepsy and TBI,” they said. “Given the rising incidence of TBI in both the military and civilian populations, these findings suggest close monitoring might be crucial to develop effective prevention strategies for long-term complications, particularly [post-traumatic epilepsy].”

Reevaluating the Treatment of Epilepsy

Juliann Paolicchi, MD, a neurologist and member of the epilepsy team at Northwell Health in New York, who was not involved with the study, said in an interview that TBIs have been studied more closely since the beginning of the conflicts in the Middle East, particularly in Iraq and Afghanistan, where “newer artillery causes more diffuse traumatic injury to the brain and the body than the effects of more typical weaponry.”

The study by Roghani and colleagues “is groundbreaking in that it looks at the connection and timing of these two disruptive forces, epilepsy and TBI, on the brain,” she said. “The study reveals that timing is everything: The combination of two disrupting circuitry effects in proximity can have a deadly effect. The summation is greater than either alone in veterans, and has significant effects on the brain’s ability to sustain the functions that keep us alive.”

The 6 months following either an epilepsy diagnosis or a TBI are “crucial,” Dr. Paolicchi noted. “Military [personnel] and private citizens should be closely monitored during this period, and the results suggest they should refrain from activities that could predispose to further brain injury.”

In addition, current standards for treatment of epilepsy may need to be reevaluated, she said. “Patients are not always treated with a seizure medication after a first seizure, but perhaps, especially in patients at higher risk for brain injury such as the military and athletes, that policy warrants further examination.”

The findings by Roghani and colleagues may also extend to other groups, Dr. Paolicchi said, such as athletes being evaluated after a concussion, patients after a motor vehicle accident, and infants with traumatic brain injury. “The results suggest a reexamining of the proximity [of TBI] and epilepsy in these and other areas,” she noted.

The authors reported personal and institutional relationships in the form of research support and other financial compensation from AbbVie, Biohaven, CURE, Department of Defense, Department of Veterans Affairs (VA), Eisai, Engage, National Institutes of Health, Sanofi, SCS Consulting, Sunovion, and UCB. This study was supported by funding from the Department of Defense, VA Health Systems, and the VA HSR&D Informatics, Decision Enhancement, and Analytic Sciences Center of Innovation. Dr. Paolicchi reports no relevant conflicts of interest.


Study Estimates Global Prevalence of Seborrheic Dermatitis

Article Type
Changed
Wed, 07/17/2024 - 10:52

 

TOPLINE:

Seborrheic dermatitis affects an estimated 4% of the global population, with significant variations across age groups, settings, and regions, according to a meta-analysis that also found a higher prevalence in adults than in children.

METHODOLOGY:

  • Researchers conducted a meta-analysis of 121 studies, which included 1,260,163 people with clinician-diagnosed seborrheic dermatitis.
  • The included studies represented nine countries; most were from India (n = 18), Turkey (n = 13), and the United States (n = 8).
  • The primary outcome was the pooled prevalence of seborrheic dermatitis.

TAKEAWAY:

  • The pooled prevalence of seborrheic dermatitis was 4.38% overall, 4.08% in studies conducted in clinical settings, and 4.71% in studies conducted in the general population.
  • The prevalence of seborrheic dermatitis was higher among adults (5.64%) than in children (3.7%) and neonates (0.23%).
  • A significant variation was observed across countries, with South Africa having the highest prevalence at 8.82%, followed by the United States at 5.86% and Turkey at 3.74%, while India had the lowest prevalence at 2.62%.

IN PRACTICE:

The global prevalence in this meta-analysis was “higher than previous large-scale global estimates, with notable geographic and sociodemographic variability, highlighting the potential impact of environmental factors and cultural practices,” the authors wrote.

SOURCE:

The study was led by Meredith Tyree Polaskey, MS, Chicago Medical School, Rosalind Franklin University of Medicine and Science, North Chicago, Illinois, and was published online on July 3, 2024, in JAMA Dermatology.

LIMITATIONS:

Interpretation of the findings is limited by research gaps in Central Asia, much of Sub-Saharan Africa, Eastern Europe, Southeast Asia, Latin America (excluding Brazil), and the Caribbean, along with potential underreporting in regions with restricted healthcare access and significant heterogeneity across studies.

DISCLOSURES:

Funding information was not available. One author reported serving as an advisor, consultant, speaker, and/or investigator for multiple pharmaceutical companies, including AbbVie, Amgen, and Pfizer.

This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.
