When the East Coast earthquake struck in late August, it appeared to hit a place called “Twitter” particularly hard.
“U.S. East Coast earthquake generated more Tweets than Osama bin Laden death,” the United Kingdom’s Telegraph reported.
“Earthquake Hits East Coast: Aftershocks Felt on Twitter,” said TheWrap.com, a Los Angeles-based publication.
The Wall Street Journal helpfully chimed in with “Earthquake on the East Coast: The Reaction on Twitter,” composed of earthquake tweets from experts such as Ice-T, Rihanna and Snooki. Bethenny Frankel, we learned, was in the middle of lunch!
Where is this Twitter? Was it anywhere close to the earthquake’s epicenter in Mineral, Va.? The people who live in Twitter — do you think that they often consort with the people who live in Facebook, another strange and wonderful land that often appears in the news?
Media types love social networking. Love using it, love finding sources with it, love analyzing it, love writing about it, love love love. It’s a way of demonstrating how much we “get it.” Except it can also demonstrate that we don’t.
You’ve read these stories. There were stories when teachers started using Facebook, stories when coaches started using Facebook, stories when congressmen and judges and authors started using Facebook. There were stories on mothers using Facebook, then grandmothers. There were stories on every demographic using Facebook, until, finally, everyone was there and someone left to join something new.
Then there were stories on how everybody was on Twitter.
“I bet that what’s happening is that editors of a certain age are starting to discover it and are getting a little amazed by it,” says Mike Hoyt, the executive editor of the Columbia Journalism Review.
In the past few years, online social networks have, indeed, allowed us to witness amazing behavior. People have found biological parents on Facebook. People have discovered cheating spouses on Facebook. People have been cyberbullied on Facebook.
(Disclosure: I have written dozens of articles about Facebook and Twitter. Some of them were good, I think, and some weren’t. Irony: A reporter who has written lots about social networks writes another article about overexposure to Facebook.)
Facebook and Twitter have become, depending on the theme of the article in question, either beacons of light or harbingers of doom — revealing how profoundly the world has changed and how, because of social networks, things are happening that never happened before.
Except that people cheated on spouses before Facebook. And people found birth parents before Facebook, too. Bullying, social isolation and teenage heartbreak are not made sadder by the fact that they now exist online as well as in the corners of middle school locker rooms.
“The moral panic about teenagers and technology is an ongoing frustration for me,” says social media researcher Danah Boyd, whose area of expertise is the way young people use the Internet.
Boyd recalls one story about a teenage girl who was charged with murdering her mother. “The [television] headlines were, ‘Girl on MySpace Kills Mother,’ ” she says. “But what’s heartbreaking was that for a year and a half she’d been documenting how her mother was abusive. It’s sad that ‘Abused Girl Kills Mother’ is not a story, but ‘Girl on MySpace Kills Mother’ is.” (An actual print headline from this case: “Murder, They Blogged.”)
What happened in that instance is what frequently happens in news stories about technology, Boyd says. “Technology becomes the point of focus, even when it’s not the most salient point. But we focus on it rather than really trying to understand what’s at play.”
There are revelatory stories about Facebook and Twitter — stories that examine how social networks are shaping or reconstructing our concepts of modern life. But sometimes, social media become a new slipcover for an old couch — a way to dress up stories that are otherwise sagging and tired. Facebook and other sites may have their own rules, mores and cultures, but as often as they reveal something unexpected about the human condition, they reveal what’s always been there.
What mattered in the MySpace story was that a troubled girl took unspeakable action, not that she had an online profile.
What mattered in Egypt was the revolution, not that revolutionaries first learned to go to Tahrir Square by checking Twitter.
What matters in stories about hooking up with old flames on Facebook is the deep yearning people apparently have to reconnect with their 17-year-old selves — not that their ex-boyfriends have “It’s Complicated” relationship statuses.
What matters about how Frankel reacted to the earthquake? Nothing. Nothing about that matters at all.
“Techno-narcissism,” Siva Vaidhyanathan says knowingly. Vaidhyanathan is a former reporter who is chairman of the department of media studies at the University of Virginia.
“It’s a particularly acute phenomenon with people who are technologically blessed. People love to tell ourselves that the things we do six to eight hours a day matter.”
A lot of people spend a lot of time online — at their computers, at desks. It helps explain why these stories get written and why they get read. We all want to think that our lives mean something.
But there’s less to social networking than meets the eye. It offers an appealing and immediate sense of intimacy — showing what people are doing and whom they are doing it with — but the intimacy is often illusory. Twitter feeds and social networking profiles are carefully constructed performances. Life there is restricted to 140 characters, or to scrupulously curated status updates that reveal more about who posters wish they were than who they are.
“The real question here,” Vaidhyanathan says, “is, how much can you really understand a person — with all of their complicated interior life — merely through his or her electronic expressions?”
And how much can you understand a society based on its tweets?
In 2011, the Pew Internet and American Life Project reported, about 13 percent of adults who were online were users of Twitter. Those users were disproportionately young, disproportionately urban and disproportionately educated. Stories about things that happen on Twitter — all the Twitterati sounding off about East Coast earthquakes and Osama bin Laden’s death — represent the reactions of a relatively small subset of the population. When journalists comb Twitter for sources or story ideas, it’s the equivalent of combing America’s bangs and leaving the rest of the head untouched.
“Wolf Blitzer reading a Twitter stream on television,” deadpans Adam Penenberg. “How meta can you get?” Penenberg is a journalism professor at New York University. (He also, famously, was the online reporter for Forbes who discovered that the New Republic’s Stephen Glass was making up his stories.)
“Wolf doesn’t do that,” says CNN’s Washington Bureau Chief Sam Feist, when told about Penenberg’s “meta” comment. Other CNN reporters seem to — Jon Stewart lambasted reporters from one show for repeatedly quoting a Twitter user named LadyBigMac.
Whenever Penenberg sees gee-whiz articles relying on or built around social networks, he thinks that it’s “all about trying to sound so hip and cool — and there’s nothing more pathetic than middle-aged people trying to sound hip and cool.” And “whenever anyone reads Twitter feeds on television,” Penenberg says, “I roll my eyeballs so far in the back of my head I’m afraid they won’t come out the other end.”
In the latter part of the 19th century, The Washington Post published a rash of stories about events that would now seem unremarkable. There were two stories about people listening to music. There was a story about a man learning that he was the father of a child, only to realize that he had been mistaken for someone else, and another in which a woman mistakenly believed that her chatting partner was making lewd remarks about young women, when really he was describing a movie poster.
In 1876, Alexander Graham Bell had applied for a patent.
These stories were written because these events unfolded on the telephone.