Sunday, February 8, 2009 5:08 PM
Facebook just turned 5 years old. But a week that should have been filled with reflection and good times was instead marred by a series of breaking news reports detailing sex scandals, phishing, and other malicious activity on the world's largest social network.
In his blog post announcing the 5-year milestone, founder and CEO Mark Zuckerberg wrote that "Facebook has offered a safe and trusted environment for people to interact online, which has made millions of people comfortable expressing more about themselves." But is Facebook really as safe as everyone seems to think?
It's Been A Long Week
On Tuesday, February 3rd, we reported that thousands of sex offenders (many of whom were previously booted from MySpace) were lurking on Facebook (they've since been removed). As CNET's Caroline McCarthy pointed out, these might not have necessarily been MySpace 'refugees' in the sense that they migrated en masse from MySpace to Facebook - they likely maintain profiles on multiple social networks. But the fact remains that there were thousands of convicted sexual offenders on a social network that is generally perceived as safe.
On Wednesday, news broke of an elaborate and disturbing sex ring involving at least 31 high school students. An 18-year-old man named Anthony Stancl has allegedly been masquerading as high-school girls on Facebook, flirting with underaged male classmates and convincing them to send him nude photographs. He would then use the photographs to blackmail the boys into performing sexual acts with him, which he photographed using a cell phone. Stancl has been charged with 12 felony counts and faces up to 300 years in prison. (In a somewhat bizarre twist, Facebook responded to news of the sex ring by stating that fewer than 1% of its 150 million users are affected by impersonation schemes. So, around 1.5 million people. Not exactly a confidence-inspiring statistic.)
The same day, Facebook updated its Terms of Service, rewording many of its rules to make them easier to understand and explicitly prohibiting some common transgressions, like including false information in profiles or creating fake accounts. But there was one far more timely addition: "If you are required to register as a sex offender in any jurisdiction, you may not use the Facebook Service." Facebook spokesman Barry Schnitt says that sex offenders had previously been banned through a number of other more general statements in the Terms of Service, but that the company wanted to make it more explicit.
On Friday, CNN reported on an increasing number of phishing attacks seen on Facebook, using a technique we first heard about in January. After gaining access to compromised accounts, scammers are now using Facebook to ask the victims' "friends" for cash. The attacks can be particularly effective because the scammers can easily look up personal details of the people they're contacting.
Finally, Maryland banned both Facebook and MySpace from its General Assembly computers, as they had been the primary sources of numerous malware attacks (though we should note that the rumored ban of Facebook in Apple stores was overblown).
Had each of these stories broken on their own, they probably would have been met with little more than raised eyebrows. After all, with over 150 million users, it's inevitable that some bad things are going to happen (and they have before). But taken together, it's clear that Facebook isn't quite the safe haven we might perceive.
How We Got Here
Since launching in 2004, Facebook has benefited from its public perception as a safe, clean site - especially compared to its biggest competitor, MySpace. Whereas MySpace allows users to customize their profile pages with graphics and audio (sometimes to the point of making them obnoxious), Facebook has maintained a more pristine environment, which certainly helps bestow a feeling of safety.
Facebook is also theoretically more secure. When it first launched, only users with valid university (.edu) email addresses could sign up. Over the years the site expanded to allow high school students, and eventually opened up to everyone. But each group of students or coworkers is still segmented into different 'networks' - you can't browse through anyone's profile unless you belong to their university or company network, usually verified through email. These roadblocks add up to make creating fake profiles more of a challenge, but as we've seen in the last week, they can be overcome.
Perhaps most important to note is Facebook's relatively good security record up until this point. Parry Aftab, an independent online security expert who heads WiredSafety, says that there have been fewer sexual predator attacks on Facebook than its competitors and that her studies have found its security measures to exceed those seen elsewhere. She also notes that in general, users have behaved better on Facebook, and that teenagers have reported that they "feel safer" on the site.
But Aftab says that given how quickly Facebook has grown - it jumped from 100 million users last August to over 150 million users today - she isn't surprised that some registered sex offenders slipped through the cracks. In her words, "if you have 150 million users, you're going to have all kinds of bad people".
So what measures can Facebook take to maintain its wholesome image?
What Needs To Change
Last May, Facebook announced that it had forged a deal with Attorneys General from 49 states to implement new safety and privacy rules (MySpace had adopted similar measures a few months earlier). Among the new policies were agreements to "aggressively remove inappropriate images and content" and to "more prominently display safety tips".
At the time we noted that this was probably a tough measure for Facebook to swallow - such initiatives can be very costly in terms of manpower, especially when it comes to moderating content. And frankly, it looks like Facebook hasn't really lived up to its promise. For starters, MySpace has a pair of human eyes looking at every photo uploaded to the site. Facebook doesn't - instead, it relies on users to flag any content they find inappropriate. Aftab says that this system is effective, but I don't regard it as "aggressive" - I'd much rather hear that Facebook employs a dedicated team to scan through photos, even if only for those shared by minors (or even better, a combination of flagging and human scanning).
The events of this week, and the sex ring case in particular, will likely be a wakeup call for Facebook, akin to MySpace's tragic suicide case a few years ago. As it continues its rapid growth, Facebook needs to step up both its technological and manpower efforts to more effectively deter malicious behavior. And Facebook's Chief Privacy Officer Chris Kelly, who plans to run for Attorney General of California, can't afford to let these issues fall to his successor.
But the reality is that no matter what these social networks do, they'll never have the technology or the manpower to stop every threat. Which is why they need to stop pretending that they're safe. Facebook's (and MySpace's) goal is to connect as many people as possible, and the sad truth is that many people are very naive when it comes to online safety. These social networks need to step up their education and awareness efforts, perhaps even offering a 'safe mode' for users (even adults) who aren't adept at navigating the web's pitfalls. Because sharing is only fun until someone gets hurt.