Prostate cancer in second- and third-degree relatives elevates risk

One long-ago summer, I joined the legion of teens helping harvest our valley’s peach crop in western Colorado. My job was to select the best peaches from a bin, wrap each one in tissue, and pack it into a shipping crate. The peach fuzz that coated every surface of the packing shed made my nose stream and my eyelids swell. When I came home after my first day on the job, my mother was so alarmed she called the family doctor. Soon the druggist was at the door with a vial of Benadryl (diphenhydramine) tablets. The next morning I was back to normal and back on the job. Weeks later, when I collected my pay (including the ½-cent-per-crate bonus for staying until the end of the harvest), I thanked Benadryl.

Today, I’m thankful my need for that drug lasted only a few weeks. In a report published in JAMA Internal Medicine, researchers offer compelling evidence of a link between long-term use of anticholinergic medications like Benadryl and dementia.

Anticholinergic drugs block the action of acetylcholine, a substance that transmits messages in the nervous system. In the brain, acetylcholine is involved in learning and memory; in the rest of the body, it stimulates muscle contractions. Anticholinergic drugs include some antihistamines, tricyclic antidepressants, medications to control overactive bladder, and drugs to relieve the symptoms of Parkinson’s disease.
What the study found

A team led by Shelley Gray, a pharmacist at the University of Washington’s School of Pharmacy, tracked nearly 3,500 men and women ages 65 and older who took part in Adult Changes in Thought (ACT), a long-term study conducted by the University of Washington and Group Health, a Seattle healthcare system. They used Group Health’s pharmacy records to determine all the drugs, both prescription and over-the-counter, that each participant took in the 10 years before starting the study. Participants’ health was tracked for an average of seven years. During that time, 800 of the volunteers developed dementia. When the researchers examined the use of anticholinergic drugs, they found that people who used these drugs were more likely to have developed dementia than those who didn’t use them. Moreover, dementia risk increased along with the cumulative dose. Taking an anticholinergic for the equivalent of three years or more was associated with a 54% higher dementia risk than taking the same dose for three months or less.
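
To see what a relative figure like that 54% means in practice, it helps to translate it into an absolute risk, and that requires assuming a baseline rate. Here is a back-of-the-envelope sketch of the arithmetic; the 10% baseline is an illustrative assumption, not a number reported in the ACT study.

    # Illustrative arithmetic only: the baseline risk below is an assumed
    # figure for demonstration, not a value reported by the ACT study.
    baseline_risk = 0.10      # assumed chance of developing dementia over the follow-up period
    relative_increase = 0.54  # the 54% higher risk reported for long-term, high cumulative use

    elevated_risk = baseline_risk * (1 + relative_increase)

    print(f"Assumed baseline risk:   {baseline_risk:.1%}")                    # 10.0%
    print(f"Risk with long-term use: {elevated_risk:.1%}")                    # 15.4%
    print(f"Absolute increase:       {elevated_risk - baseline_risk:.1%}")    # 5.4%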

The ACT results add to mounting evidence that anticholinergics aren’t drugs to take long-term if you want to keep a clear head, and keep your head clear into old age. The body’s production of acetylcholine diminishes with age, so blocking its effects can deliver a double whammy to older people. It’s not surprising that problems with short-term memory, reasoning, and confusion lead the list of side effects of anticholinergic drugs, which also include drowsiness, dry mouth, urine retention, and constipation.

The University of Washington study is the first to include nonprescription drugs. It is also the first to address the possibility that people were taking a tricyclic antidepressant to relieve early symptoms of undiagnosed dementia, which could make the drug look like a cause when it was merely a marker; the risk associated with bladder medications, which have no such use, was just as high.

“This study is another reminder to periodically evaluate all of the drugs you’re taking. Look at each one to determine if it’s really helping,” says Dr. Sarah Berry, a geriatrician and assistant professor of medicine at Harvard Medical School. “For instance, I’ve seen people who have been on anticholinergic medications for bladder control for years and they are completely incontinent. These drugs obviously aren’t helping.”

Many drugs have a stronger effect on older people than younger people. With age, the kidneys and liver clear drugs more slowly, so drug levels in the blood remain higher for a longer time. People also gain fat and lose muscle mass with age, both of which change the way that drugs are distributed to and broken down in body tissues. In addition, older people tend to take more prescription and over-the-counter medications, each of which has the potential to suppress or enhance the effectiveness of the others.
What should you do?

In 2008, Indiana University School of Medicine geriatrician Malaz Boustani developed the anticholinergic cognitive burden (ACB) scale, which ranks these drugs according to the severity of their effects on the mind. It’s a good idea to steer clear of the drugs with high ACB scores, meaning those with scores of 3. “There are so many alternatives to these drugs,” says Dr. Berry. For example, selective serotonin re-uptake inhibitors (SSRIs) like citalopram (Celexa) or fluoxetine (Prozac) are good alternatives to tricyclic antidepressants. Newer antihistamines such as loratadine (Claritin) can replace diphenhydramine or chlorpheniramine (Chlor-Trimeton). Botox injections and cognitive behavioral training can alleviate urge incontinence.

One of the best ways to make sure you’re taking the most effective drugs is to dump all your medications — prescription and nonprescription — into a bag and bring them to your next appointment with your primary care doctor.

As is true for so many types of cancer, prostate cancer is caused partly by environmental factors and partly by genetic factors. Men are at greater risk for developing prostate cancer if their fathers or brothers also developed the disease. But does the risk also increase if a man’s more distant relatives had prostate cancer?

A study published in The Prostate provides an answer. Using a database covering more than 100 years, Utah investigators quantified prostate cancer risk according to three levels of relatedness:

    first-degree relatives, such as parents, full siblings, and children
    second-degree relatives, such as grandparents, grandchildren, uncles, nephews, or half-siblings, and
    third-degree relatives, such as great-grandparents, great-grandchildren, and first cousins.

As expected, men who had first-degree family members with prostate cancer faced the highest relative risk: estimates ranged from 2.5 times higher if one first-degree relative was affected to nearly 8 times higher if four first-degree relatives had prostate cancer. But prostate cancer in more distant relatives was also a risk factor.

A diagnosis in a man’s uncles (second-degree relatives), for instance, roughly doubled his risk. Having third-degree relatives with prostate cancer also increased a man’s risk of developing the disease. No added risk was noted for men who had a fourth-degree relative with prostate cancer.

Lisa Albright, a statistical geneticist at the University of Utah School of Medicine who led the study, said the risks were similar regardless of whether the genes were passed through the mother or the father. “This isn’t something people normally think about,” she said. “But if the mother’s brother had prostate cancer, then the nephew is also at greater risk.”

To generate the results, Albright’s team scoured two data repositories—a genealogy database for Utah residents dating back to the 1880s, and a cancer registry for the state that dates back to 1966. The two databases were linked, allowing the investigators to assess family history of prostate cancer over multiple generations.

Risks for dozens of familial combinations were quantified. However, the critical findings appear in a single table “that we hope clinicians will laminate and keep in a back pocket,” Albright said. Presented here, the table describes the kinds of family histories that either double or triple the relative risk of prostate cancer.
Family history constellations that double or triple prostate cancer risk
Relative risk greater than 2 (26% of males):
    1 affected first-degree relative
    3 or more affected second-degree relatives
    Mother’s father affected
    Nephew affected
    Maternal and paternal uncles affected

Relative risk greater than 3 (10% of males):
    2 or more affected first-degree relatives
    5 or more affected second-degree relatives
    Both grandfathers affected
    1 or more affected first-degree relatives and 2 or more affected second-degree relatives
    1 or more affected first-degree relatives diagnosed before age 70

Adapted from Albright F, Stephenson RA, Agarwal N, Teerlink CC, Lowrance WT, Farnham JM, Cannon Albright LA. Prostate cancer risk prediction based on complete prostate cancer family history. The Prostate 2014, DOI: 10.1002/pros.22925
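
For readers who like to see the pattern spelled out, the table’s two risk tiers can be written as a simple lookup. The sketch below is purely illustrative; the constellation labels and the risk_tier function are shorthand invented here, not a clinical tool from the study.

    # A minimal, illustrative encoding of the two risk tiers from the table above.
    # The constellation labels are shorthand for this sketch, not clinical criteria.
    DOUBLED_RISK = {
        "1 affected first-degree relative",
        "3 or more affected second-degree relatives",
        "mother's father affected",
        "nephew affected",
        "maternal and paternal uncles affected",
    }

    TRIPLED_RISK = {
        "2 or more affected first-degree relatives",
        "5 or more affected second-degree relatives",
        "both grandfathers affected",
        "1+ first-degree and 2+ second-degree relatives affected",
        "1+ first-degree relatives diagnosed before age 70",
    }

    def risk_tier(constellation: str) -> str:
        """Return the rough risk tier for a family-history constellation."""
        if constellation in TRIPLED_RISK:
            return "relative risk greater than 3"
        if constellation in DOUBLED_RISK:
            return "relative risk greater than 2"
        return "not listed in this table"

    print(risk_tier("nephew affected"))  # relative risk greater than 2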

Because the analysis is limited to Utah residents—mostly of Northern European descent—it may not account for the effect of family history in other races or ethnicities, Albright acknowledged. “We hope to collaborate with others elsewhere who have similar resources to work with,” she said.

Still, the results indicate that “individuals who have certain family history patterns may deserve more careful scrutiny with respect to screening,” said co-author Robert Stephenson, a urological surgeon at the University of Utah School of Medicine. Screening means checking apparently healthy individuals for signs of hidden disease. For prostate cancer, this is usually done with the prostate-specific antigen (PSA) blood test.

The US Preventive Services Task Force recommends against routine PSA testing. But it has said that additional research is needed to determine whether the balance of benefits and harms of prostate cancer screening differs in men at higher risk of developing or dying of prostate cancer.

“This is a unique data set that provides a greater degree of precision of prostate cancer risk,” said Dr. Marc Garnick, the Gorman Brothers Professor of Medicine at Harvard Medical School and Beth Israel Deaconess Medical Center. “The real dilemma, which this study does not address, is whether there are any meaningfully beneficial outcomes to those individuals who are diagnosed (after being identified as being at high risk of having the disease) and treated. Nevertheless, identification of these at-risk patients should eventually help elucidate the genes that may be responsible for the development of prostate cancer.”

Warnings against eating foods high in cholesterol, like eggs or shrimp, have been a mainstay of dietary recommendations for decades. That could change if the scientific advisory panel for the 2015 iteration of the Dietary Guidelines for Americans has its say.

A summary of the committee’s December 2014 meeting says “Cholesterol is not considered a nutrient of concern for overconsumption.” Translation: You don’t need to worry about cholesterol in your food.

Why not? There’s a growing consensus among nutrition scientists that cholesterol in food has little effect on the amount of cholesterol in the bloodstream. And that’s the cholesterol that matters.

Nutrition experts like Dr. Walter C. Willett, chair of the Department of Nutrition at Harvard School of Public Health, called the plan a reasonable move. Dr. Steven Nissen, chair of cardiovascular medicine at the Cleveland Clinic, told USA Today “It’s the right decision. We got the dietary guidelines wrong.”

Keep in mind that this isn’t a done deal. The panel, which is formally known as the 2015 Dietary Guidelines Advisory Committee, makes recommendations for the next guidelines update, but these recommendations aren’t always followed.
The cholesterol connection

Cholesterol has a bad reputation, its name linked to heart attacks, strokes, and other types of cardiovascular disease. Yet cholesterol is as necessary for human health as water or air.

Cholesterol is a type of fat, or lipid. It is an essential building block for cell membranes and other crucial structures. It is needed to form the protective sheath that surrounds nerve fibers. The body uses cholesterol to make hormones such as testosterone and estrogen, the bile acids we need to digest and absorb fats, and vitamin D.

Cholesterol is so important that your liver and intestines make it day and night from fats, sugars, and proteins. In the average person, the body’s production of cholesterol far outstrips any contribution from cholesterol in food.

Why is blood cholesterol a concern? Too much of it, especially in the wrong kind of particle, can cause trouble inside blood vessels (see “From cholesterol to crisis” below). Harmful low-density lipoprotein (LDL) particles ferry cholesterol to artery walls. Protective high-density lipoprotein (HDL) particles pull cholesterol out of circulation and deliver it to the liver for destruction.

Doing away with the beware-cholesterol-in-food warning would simplify the art of choosing healthy foods. And it would let people enjoy foods that contain higher amounts of cholesterol, such as eggs, shrimp, and lobster, without worrying about it. A better focus is on reducing saturated fat and trans fat in the diet, which play greater roles in damaging blood vessels than dietary cholesterol.

Science, including nutrition science, is a process of change. New findings emerge that nudge aside old thinking and prompt new recommendations. That’s easy for someone like me to say, since I closely follow nutrition science and research and understand how they work. But for folks who don’t, a change in the recommendations about cholesterol in food is likely to be seen as another dietary flip-flop and undermine confidence in what’s known about healthy eating.
