Too much TV might damage your brain and raise the risk of developing Alzheimer’s disease, and the effects could show up much sooner than previously believed, a new study suggests.
Researchers at the Northern California Institute for Research and Education in San Francisco investigated the association between sedentary lifestyles, cognitive performance and the risk of developing dementia. They found that people who watched a lot of television, namely four hours or more per day, scored significantly lower on measures of cognitive performance in middle age.
The study, which tracked people for 25 years beginning in young adulthood, found that people who also reported low levels of physical activity performed worse on cognitive tests.
Kristine Yaffe, a professor of psychiatry, neurology and epidemiology at the University of California at San Francisco, said the results have important implications for children and young adults, who are more than ever glued to the screens of electronic gadgets as part of a sedentary life at home and in the workplace.
But Yaffe said the research carries a hopeful message, too: People can likely lower their risk of later cognitive decline, and perhaps of developing dementia, by changing their lifestyle.
“This is something you can do something about,” Yaffe told reporters at a briefing early Monday at the annual Alzheimer’s Association International Conference in Washington.
In a similar vein, a separate study presented Monday pointed to the risks of loneliness and social isolation.
The findings come as shifting demographics are raising the median age in the United States and several other developed countries. More than 28 million baby boomers are projected to develop Alzheimer’s by 2050, and more than 5 million people are living with the disease now, according to the Alzheimer’s Association. The cost of their care is projected to consume about 24 percent of Medicare funding by 2040, up from an estimated 2.1 percent in 2020.
The California study, which investigated physical activity and television-viewing habits and their impact on cognitive function, examined 3,247 adults who were 18 to 30 years old when they enrolled in the Coronary Artery Risk Development in Young Adults Study.
Their exercise habits and TV viewing were evaluated using questionnaires three times during the course of 25 years. Low physical activity was defined as burning fewer than 300 calories in a 50-minute session three times a week — which, by at least one measure, is about 100 calories less than the equivalent of playing a round of golf while riding in a golf cart.
A high amount of television watching was defined as more than four hours a day. If a person met those thresholds on two of three follow-up visits, they were deemed to have a long-term pattern of low physical activity or television-viewing.
About 17 percent reported low physical activity, and about 11 percent qualified as heavy TV viewers; 3 percent reported both, the researchers said.
An analysis of the results showed that people who watched a lot of television were 1.5 times more likely to perform poorly on cognitive tests than those who watched less television, Yaffe said.
Compared with participants who had high physical activity and low television viewing, a relatively sedentary person who exercised little and spent a lot of time in front of the television was twice as likely to perform poorly on cognitive tests in midlife, the study found.
Those results suggest that sedentary habits set early in life can perhaps have an impact on one’s dementia risk in midlife and later.
“What’s happening at one’s midlife is setting the stage for what’s happening over the next 20 or 30 years,” Yaffe said. Yet fewer than half of Americans meet recommended exercise standards.
Meanwhile, another new study lent further support to the belief that loneliness can raise the risk of developing dementia, independent of the size of one’s actual social network.
“The prevailing view is that loneliness is a form of psychosocial stress,” said Nancy J. Donovan, an associate psychiatrist at Brigham and Women’s Hospital and Harvard Medical School.
Donovan said her team tracked 8,311 adults in the U.S. Health and Retirement Study from 1998 to 2010. The participants, who were 65 and older, were given biennial assessments of their perception of loneliness using a questionnaire. The researchers examined the participants’ cognitive performance and factored in their health status, sociodemographic status and social network characteristics.
The team found that the loneliest people, about 1 in 6 participants, experienced the steepest decline in cognitive performance. Their scores fell 20 percent faster than those of people who did not report feeling lonely.
Donovan said the findings underscored the need to address social isolation among older people.
“First, loneliness is a form of suffering in older people that is prevalent but undetected and untreated in medical practice,” she said. “Second, loneliness has consequences. Our work shows that loneliness, like depression, is associated with accelerated cognitive decline in older Americans. This finding is important because it opens up new approaches for preventing and treating Alzheimer’s disease.”