Should Facebook take down a doctored video of Nancy Pelosi? Ban a conspiracy theorist like Alex Jones?
These are the kinds of content moderation quandaries that have vexed the world's largest social network for years, and after repeated controversies and missteps, the company says it can't make these decisions alone. That's why Facebook has been building a “Supreme Court” of independent experts to weigh in on the company's toughest content moderation decisions, and its hope is that the body will one day govern such decisions across Silicon Valley.
“It's just going to impact our platforms, but the hope absolutely is that at some point this is going to be an industry-wide body,” said Facebook public policy manager Shaarik Zafar at a panel on free expression online yesterday at the New America Foundation. “At that point you would have some type of consistency across platforms.”
Facebook’s plans highlight that without government regulation, technology companies have been left to develop their own solutions to policing troublesome content. Critics say the companies waited too long to address many issues related to harmful content, and the independent oversight board is one way that Facebook is going on the offensive after broad public backlash.
The plans for a board are just one of many steps Facebook is taking to show that it cannot confront all of its challenges alone. Facebook chief executive Mark Zuckerberg has called on global regulators to help the company address harmful content issues, but the company says it can’t wait for politicians to take action.
“We recognize there's a role for regulation, absolutely,” Zafar said. “But we don't want to wait for the perfect regulation.”
Facebook is building this board as tech companies' content moderation decisions draw greater scrutiny from policymakers in the United States and abroad. While countries like Australia and Germany have passed laws requiring companies to remove certain types of hate speech or harmful content, experts say it's unlikely that U.S. lawmakers would pursue similar regulations because of First Amendment concerns.
Francella Ochillo, a digital rights advocate and executive director of the nonprofit Next Century Cities, says that industry leaders — not policymakers — are going to have to develop best practices for themselves.
“Truthfully by the time everyone gets their hands around the issue of the day, we’ve already moved on to a new issue,” Ochillo said on the same panel. “Right now I don’t think the government has the expertise or capacity.”
She also said she was not convinced that government regulation would help address problems with online free expression.
That’s why creating an industry-led body might be one solution. Facebook says the oversight board will operate with complete independence, and Zafar said the company plans to follow the recommendations of the board even when it disagrees with them.
But Facebook's promises have been met with skepticism from critics who see the board as a public relations move, or as a way for Facebook to shirk responsibility for making tough decisions about the content it hosts.
Facebook recently asked other tech policy experts to weigh in on its plans. One such expert was Sharon Bradford Franklin, the director of surveillance and cybersecurity policy at the New America Foundation's Open Technology Institute. She said the organization is waiting for clarification on how much authority the board's recommendations will carry.
Bradford Franklin said the organization is encouraging Facebook to let the board weigh in on the development of its community standards, and to consider diversity when selecting board members.
BITS: As many as 4 million people have had personal and corporate secrets leaked through Chrome and Firefox extensions, according to a new investigation from my colleague Geoffrey Fowler. Google and Mozilla shut down the leaks immediately after Geoffrey notified them, but it's probably only the tip of the iceberg.
Seemingly legitimate browser extensions, also known as add-ons or plug-ins, can quietly coax users into handing over access to their data without a second thought. Once installed, an extension can see every web page you visit and everything you click on, and sometimes even your passwords and location. Some of these extensions then cash in by selling the data.
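To illustrate how an extension gets that level of access: the permissions are declared up front in the extension's manifest file and approved by the user once, at install time. A hypothetical Chrome manifest (the extension name and script file are invented for this sketch) requesting broad access might look like this; the "tabs" permission lets the extension's background script see the URL and title of every open tab, and the "<all_urls>" host pattern lets it read and inject scripts into any site:

```json
{
  "name": "Hypothetical Coupon Finder",
  "version": "1.0",
  "manifest_version": 2,
  "permissions": [
    "tabs",
    "<all_urls>"
  ],
  "background": {
    "scripts": ["background.js"]
  }
}
```

Once the user clicks through that one-time prompt, everything the extension does with this access afterward, including shipping browsing data to a third-party server, happens invisibly in the background.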
While many companies claim to anonymize the data, Geoffrey found the extensions make a host of personal information easily available on the web. Geoffrey was able to find medical records, flight confirmation numbers and the associated passenger names, and private tax documents.
“Employees from more than 50 major corporations were exposing what they were working on (including top-secret stuff) in the titles of memos and project reports,” Geoffrey found. “There was even information about internal corporate networks and firewall codes.”
The problem goes well beyond the extensions Geoffrey discovered: Researchers at North Carolina State University found that at least 3,800 of the 180,000 extensions available for Chrome leak the data of potentially hundreds of millions of users.
NIBBLES: President Trump wants to launch an investigation into the highly contested bidding process for a $10 billion Pentagon cloud computing contract, he told reporters yesterday. Trump said he has received complaints from “companies like Microsoft, Oracle, and IBM” about the process, my colleagues Aaron Gregg and Jay Greene report.
“They’re saying it wasn’t competitively bid,” Trump said. “Some of the greatest companies in the world are complaining about it, having to do with Amazon and the Department of Defense, and I will be asking them to look at it very closely to see what’s going on.” Both IBM and Oracle, which recently lost a lawsuit contesting the bid process, have been vocal in alleging that Pentagon officials favored Amazon in creating the project. Microsoft, which is the only other finalist for the contract, has not previously raised concerns publicly.
An internal investigation by the Pentagon cleared the bid process of any wrongdoing, and an intervention by the White House into the process would be “unusual,” Aaron and Jay report. It could also further set back the awarding of the contract, which has been delayed by investigations and lawsuits for over a year.
BYTES: The Federal Trade Commission’s reevaluation of a law protecting children’s privacy could lead to tougher enforcement against apps and websites that illegally collect data on children, my colleagues Tony Romm and Craig Timberg report. “Among the FTC’s concerns are websites, video games and other services that are not explicitly marketed for children but still attract large numbers of young people anyway,” they write.
The law, the 1998 Children's Online Privacy Protection Act, punishes only companies that knowingly target underage users, making it easy for companies to plead ignorance, advocates say. The FTC has already begun taking concerns that the law isn't working more seriously: it recently launched an investigation into advocates' complaints that YouTube was illegally collecting the data of children under 13 and targeting them with ads, and in February it fined Musical.ly (now TikTok) a record $5.7 million for violating the law.
But some argue that the bigger problem is the law itself, not just enforcement. Sens. Ed Markey (D-Mass.) and Josh Hawley (R-Mo.) are proposing a bill that would update the law so that companies with “demonstrably large numbers of underaged users” are on the hook regardless of intent, and that would raise the age of protection to 15.