(Rachel Orr/The Washington Post; iStock)

Every new technology comes with accompanying fears about how its use will “change” (read: harm) our brains. But no social network has been as widely derided, demonized or scaremongered as Twitter, the short-form messaging service that turned 10 on Monday.

Time dubbed it “the crack” of Internet addiction a mere year after it launched in March 2006. Susan Greenfield, the controversial British neuroscientist and politician, has repeatedly claimed that the network has an “infantilizing” effect, making adults think — and thus, behave — more like needy, hyperactive children. Twitter has also, according to popular rumor, slashed our attention spans, torpedoed our ability to read long or think deep, bewitched us with false signals of our own social importance and otherwise “rewired” our well-evolved cognitive processes.

Counter to popular perception, however, not a single one of those “harms” has been conclusively demonstrated. In fact, virtually everything you think you know about how social media affects the brain is based on conjecture.

“There has not been one study that looked at the effects of social media on the brain,” said Dar Meshi, a cognitive neuroscientist at Freie Universität Berlin. “We don’t know anything [about how the brain changes in response to social media].”

What we do know about the brain, at this juncture, doesn’t yet add up to the vision of terror that people such as Greenfield have described. Among the things we do know: The brain is literally always changing as it encounters new information. (In other words, “rewiring” isn’t inherently or necessarily bad.) Also, every brain is different and thus responds differently to certain sorts of stimuli.

In December, Meshi, along with two colleagues, published a well-received review in the journal Trends in Cognitive Sciences that evaluated the current research — fewer than 10 studies, as of publication — on social media and the brain. They found that, while we may be more stimulated than ever, no study has yet demonstrated that social media is “rewiring” our brains in a way that is different or worse than, say, having a conversation or reading an article such as this one. And in cases in which social media does appear to cause bad behavioral effects, it’s unclear whether the medium is to blame or whether there’s some secondary, underlying reason.

Past research has found, for instance, that major changes in the adolescent brain spring foremost from genetics. And Meshi’s past work has found that people with high sensitivity in their left nucleus accumbens, part of the brain’s reward system, are more tuned in to Facebook. It’s not that Facebook has changed or “rotted” their brains — it just made a natural impulse more readily visible.

The opposite is true, as well, Meshi said. Depending on the sorts of mental processes it provokes, Twitter can also theoretically provide all kinds of cognitive benefits. Inspired by an argument I had recently with Emory University’s Mark Bauerlein (the author of “The Dumbest Generation,” a title that really speaks for itself), I asked Meshi whether there’s something profoundly different going on in your brain when you read “The Odyssey” vs. when you read a tweet or a text message.

Sure, Meshi acknowledged, if the text or the tweet is directed at you, it will probably activate the regions of the brain that deal with self-referential cognition. But otherwise, he said, it’s “a very similar neural process.”

“It doesn’t depend on the medium — it depends on the type of cognition that is generated by the tweet or the book,” Meshi said. In other words, when it comes to the mechanics of your brain, the medium is decidedly not the message.

Unfortunately, until more neuroscientists take an interest in this specific field — and until pronouncements about the brain-melting evil of social media stop selling books and sparking clicks — you’re unlikely to hear that Twitter’s anything but a playground for mental adolescents. Let’s hope the conversation has progressed a bit further by Twitter’s 20th anniversary, in 2026.


Correction: An earlier version of this post misspelled Mark Bauerlein’s name.