In 1970, the prevalence of attention deficit hyperactivity disorder was around 1 percent for school-aged children, and it was very rare for a teenager or adult to be diagnosed with the condition. Now, CDC statistics show that 11 percent of children between 4 and 17 have been diagnosed with ADHD at some point in their lives. Moreover, research has shown that 60 percent of children diagnosed with the condition take it forward with them into adulthood.
Several factors contributed to the dramatic increase in ADHD diagnoses over the past 40 years, and to the related upsurge in stimulant medication prescriptions, including better diagnostic tools, decreased stigma and heightened awareness. Still, the parallel between the growth of technology in our lives and the rise of ADHD cannot be ignored, and in my opinion, “Star Wars” is an inescapable data point.
The first of George Lucas’s space epics was released in the spring of 1977. Its memorable characters, sweeping saga and blazing special effects all played an obvious role in its massive success and enduring impact. But less attention has been paid to one truly revolutionary quality: its groundbreaking editing.
Before “Star Wars,” movies moved at a slower tempo. The camera might hold a shot for several minutes at a time, with individual scenes playing out in something like our own real time. George Lucas changed all of that with fast-paced cuts and accelerated, rhythmic editing. In his book, “The Editing of Star Wars: How Cutting Created a Classic,” author Linton Davies notes that the film contained 2,177 cuts – far more than, say, “Annie Hall,” “Saturday Night Fever” or any of the other big movies of 1977.
The rapid pace of these cuts amped up Lucas’s narrative, Davies wrote: “A quickly cut sequence increases the perceived significance of what happens on screen, not least because it paradoxically tends to increase the overall duration of the sequence by repeating various actions from a number of different angles and perspectives.”
For example, in the scene where rebels overrun the prison block in a quick-fire gun battle, the Oscar-winning editor Paul Hirsch wove together a series of shots with an average length of a mere two seconds. Not much actually happens in this sequence, Davies wrote: the rebels shoot down the guards and security cameras. But “the extremely quick editing creates a feeling of confusion and disorientation,” and the editors enhanced the drama by presenting the same moment of action with shots from multiple angles. “In fact,” Davies wrote, “the same shot of a camera being smashed is used three times in fifteen seconds, but the frantic pace of the scene means that the repetition goes unnoticed by the viewer. This is a turning point in the film, and the chaotic editing style reflects this feel.”
There is no denying that life after the first “Star Wars” got faster and that with each passing decade the pace has continued to accelerate. We see this especially with our modes of entertainment.
It was after 1977 that more of our entertainment hours became bound up in technology, most notably with the rise of video games, whether on Atari’s home consoles or in the Pac-Man-era golden age of arcades. Is it a coincidence that ADHD diagnoses increased in the 1980s as these products came onto the scene, not to mention the first generation of computers entering our homes? Is it also a coincidence that many children and teens with ADHD enjoy playing video games, and that these games have become increasingly fast-paced and overstimulating over time?
And how we watch TV has changed quite a bit since the first “Star Wars.” In 1977, when people watched the evening news, they typically saw an anchor reading out loud from behind a modest desk. Today, if you turn on CNN or Fox News, your frontal lobe is bombarded with information coming at you from many angles: not only rapid cuts to journalists in the field, but also talking-head pundits piped in from distant studios, fast-moving graphics and sound-effect swooshes, and the ubiquitous running “crawl” of completely unrelated news scrolling across the bottom of the screen.
Even MTV – a thrilling new medium when it launched four years after “Star Wars” — has adjusted its programming dramatically to remain relevant. Full-length music videos are mostly a thing of the past, possibly because today’s teens and young adults don’t have the attention span for a three-minute song and are more inclined to spend that time on their phones or computers.
So what does the research on this topic show? According to Dimitri Christakis, director of the Center for Child Health, Behavior and Development at Seattle Children’s Research Institute, children in 1970 started watching TV at around 4 years of age; children today begin at around 4 months of age. Today, 5-year-olds engage in four and a half hours of screen time daily, as much as 40 percent of a young child’s waking hours. Christakis asserts that the more TV children watch before the age of 3, the more likely they are to experience attention problems at school age, around 7 years old.
But for Christakis, fast pacing and rapid sequencing appear to be the real problem when it comes to media’s negative effects on young children. The findings from one of his studies revealed that children who viewed educational and naturally paced programs, such as “Mister Rogers’ Neighborhood,” showed no increased risk for ADHD. In striking contrast, children in the study who viewed fast-paced entertainment were 60 percent more likely to have later attention problems, and children who viewed violent material were 100 percent more likely to develop them.
For Christakis, prolonged exposure to rapid image change during the critical period of brain development preconditions the mind to expect higher levels of stimulation, which in turn leads to short-term, and possibly long-term, attention deficits. He has fittingly termed this process “the overstimulation hypothesis.”
As parents of young children know, the American Academy of Pediatrics (AAP) now recommends that screens and media be avoided entirely for children 2 years of age and younger, and that older children have limited exposure to screen time. In making this recommendation, the AAP draws on Christakis’s research and that of several others in the field, whose findings show that fast-paced viewing can overtax the infant and toddler brain.
There is no clear cause for ADHD, which makes the condition somewhat of an enigma. No medical test or brain scan identifies ADHD, and there is a degree of subjectivity, and arguably cultural influence, in determining the diagnosis. However, certain factors are known to play a role: heredity and genetics, problems during pregnancy (e.g., maternal smoking or drinking), exposure to toxic substances (e.g., lead), neurochemistry, and brain injuries, traumas, tumors and strokes.
Okay, okay, so “Star Wars” didn’t cause ADHD. But it’s impossible to ignore the relationship between technology and the increase in the condition’s diagnosis and treatment over the past 40 years. For those of us who remember seeing the original in theaters, the recent release of “Star Wars: The Force Awakens” is not only a nostalgic experience but a bittersweet one. It harks back to our days of riding a bike all day, unsupervised, in real time, without a smartphone or gaming system or gadget of some sort to interfere. And that’s suddenly a galaxy far, far away.
Michael Oberschneider is a child psychologist and the founder and director of Ashburn Psychological and Psychiatric Services. His new children’s book, Ollie Outside, which deals with screen time management, is slated to be released by Free Spirit Press this summer.