Can we really trust our privacy to Facebook?
In his May 24 op-ed, "A new page in Facebook privacy," Mark Zuckerberg laid out an appealing vision in which sharing information leads to greater openness and a better world. If only it were that simple.
The truth is that many repressive governments take advantage of Facebook's open platform to identify "enemies of the state," such as individuals who are "fans" of a particular politician, or to track activists by identifying who is a "friend" of government critics outside their borders. This is possible because Facebook's policies make the pages that users like public by default. Here at home, Facebook's policies have permitted the sharing of users' data with third parties without their knowledge, raising serious concerns about users' control over how their data are shared.
That's why tweaking Facebook's privacy policies to make them more user-friendly -- while welcome -- misses the more fundamental point. So long as Facebook bases its privacy policies on the belief that privacy is an outdated notion being replaced by a principle of "openness," it will continue to put its users at risk and will play a dangerous role in degrading a fundamental human right that, in many countries, has life or death consequences.
Elisa Massimino, Washington
The writer is president and chief executive of Human Rights First.
Mark Zuckerberg's defense of Facebook sounds good, but what began as an unencumbered dorm-room project is now subject to the competing pressures of privacy and economic reality. Facebook makes money from advertisers.
Advertisers want the best targeting and access possible. Targeting is possible only by collecting and using personal information. Maybe information will be kept private, but there have been enough examples of "oops" to make one cautious. Why should anyone trust Facebook with personal information?
Bruce Jamison, Alamo, Calif.