Facebook is staunchly defending its plans to forge ahead with strong encryption, even as senators from both parties escalate warnings that the added security could make it harder to spot and stop child abuse.
Privacy executive Jay Sullivan told the Senate Judiciary Committee that Facebook is working on techniques that will detect instances of child exploitation, even with encryption so tough the company will not be able to see the contents of messages.
Sullivan signaled that Facebook could take lessons from its subsidiary WhatsApp, which already uses this kind of encryption and currently removes about 250,000 accounts each month for child abuse.
“Facebook invests immense resources on safety, and there’s a lot that we can and will do to keep people safe while providing secure, private communications,” Sullivan said.
The company is expanding the tools users have to flag problematic messages, he said. These user reports to the company are not encrypted, so Facebook can act on them and share them with law enforcement following a valid request, Sullivan said. The company is also committing to scouring other data, such as profile pictures, group names and information shared across its services to crack down on abuse.
Facebook's plans have created fresh urgency in the encryption debate, especially as the Justice Department shifts its focus to how the technology can shield child abusers and sex traffickers, rather than how it benefits terrorists plotting operations.
Yet the latest testimony may not go far enough to satisfy members of Congress, who now say they are ready to step in and force tech companies to allow law enforcement access to encrypted messages if the companies don't volunteer it.
Facebook Messenger alone was responsible for nearly 12 million of the 18.4 million worldwide reports of sexual abuse material last year, according to the New York Times.
Senators expressed skepticism that Facebook's investments in alternative techniques could keep up with the broad scale of content exploiting children on its platform. "Since most of the reports are based on content, I don't know how you're not going to see a pretty significant decrease in the number of reports that you're able to make," Sen. Mazie Hirono (D-Hawaii) warned at the hearing.
She said Facebook's efforts to identify abuse outside the messages themselves "seems very daunting considering the billions of messages that are being put out there."
Hirono additionally asked if Facebook would conduct an analysis of its ability to report sexual abuse material as it rolls out encryption. She also asked if Facebook would consider dropping its encryption plans if the reports substantially decreased. Sullivan did not commit to either during the hearing.
Facebook is making the case that the expansion of strong encryption is about protecting users' privacy and security at a time when its data collection practices are under intense scrutiny in Washington. Sullivan warned lawmakers that if American companies don't lead in this area, consumers will seek out alternative services from other countries.
"We think it is critical that American companies lead in the area of secure, encrypted messaging," Sullivan said. "If the United States rolls back its support for privacy and encryption, foreign application providers will fill the vacuum. These firms will be largely out of the reach of U.S. law enforcement and will not be as committed to or capable of preventing, detecting and responding to bad actors.”
Lawmaker scrutiny of the company's defenses intensified yesterday as Facebook sent a letter rebuffing a request from Attorney General William P. Barr to halt its encryption expansion, my colleague Tony Romm reports. The company called the U.S. government’s pursuit of a “backdoor” into secure communications a “gift to criminals, hackers and repressive regimes.”
But lawmakers are still escalating the pressure on the companies, my colleague Joseph Marks notes in today's Cybersecurity 202.
“I think all of us want devices that protect our privacy,” said GOP Sen. Lindsey O. Graham (S.C.), the chairman of the Senate Judiciary Committee, as he opened the hearing. “Having said that, no American should want a device that becomes a safe haven for criminality.”
BITS, NIBBLES AND BYTES
BITS: The Justice Department is exploring changes to a decades-old law that protects tech companies from liability for content posted by users, Attorney General William P. Barr told a group of state attorneys general yesterday, my colleague Tony Romm reports. The remarks signal that the department is wading into an ongoing Washington debate over how tech companies moderate their content, in addition to a wide range of other competition concerns the agency is probing.
Barr cited conservative criticism that tech companies unfairly remove “third party speech, including political speech selectively and with immunity.” The protections outlined in Section 230 of the Communications Decency Act have “expanded far beyond what Congress originally intended,” Barr said. The agency is “thinking critically on the issue,” he added.
Barr didn’t mention companies such as Facebook and Google by name, but the high-profile warning comes as tech companies are also being probed by state attorneys general and the Federal Trade Commission.
“We’ve heard widespread concerns from consumers, businesses, [and] entrepreneurs, including about stagnated innovation, high prices, lack of choice, privacy, transparency and public safety, and in response, DOJ initiated a review into market-leading online platforms,” Barr said.
NIBBLES: House Speaker Nancy Pelosi called a legal shield included in the North American trade deal a “real gift to Big Tech.” The California Democrat said the language similar to Section 230 included in the deal was her “one disappointment,” the Hill's Chris Mills Rodrigo reports.
Pelosi (D-Calif.) acknowledged at a news conference yesterday that her 11th-hour push to keep the language out of the deal was “too late.” Last week, she announced her opposition to including the shield in the U.S.-Mexico-Canada Agreement amid concerns about exporting the language that is at the center of a heated debate in Washington.
Advocates for Section 230 changes have found a silver lining. Pelosi's effort put an increased spotlight on the inclusion of liability protections for tech companies in trade deals, and it could make it more difficult for the tech industry to lobby for such protections in the future. They also don't think the USMCA deal will hinder future efforts to change Section 230 domestically, especially in light of the defenses the tech industry launched in recent days to ensure the provision was included.
“It is unfortunate that the Sect. 230 language was not taken out,” Rick Lane, a longtime technology policy adviser who has supported overhauling Section 230, told me. “But at least for those of us who believe that changes to CDA Sect. 230 are necessary can take solace in knowing that organizations like CTA and NetChoice have stated unequivocally that inclusion of Section 230 language in trade agreements does not stop the [United States] from changing the law in the future should it choose to do so.”
But tech industry trade groups took a victory lap yesterday.
“Today’s announcement is welcome news to the software industry, and businesses of all sizes across the economy who rely on data to succeed and grow,” said Craig Albright, vice president of legislative strategy at BSA | The Software Alliance. “The software that powers our daily lives depends on the rapid and seamless movement of data across borders. The USMCA establishes a gold standard for rules on digital trade. Congressional action on the USMCA is critical to securing American leadership in the digital economy into the future.”
BYTES: A bipartisan coalition of members from both chambers of Congress introduced legislation that would require companies to retain information about illegal photos and videos on their platforms for a longer period of time, in the hopes of giving law enforcement more time to gather evidence of crimes, the New York Times's Michael H. Keller reports. Lawmakers say the bill comes in response to a New York Times investigation revealing that data deletion often led to child abuse cases going cold.
“This bill gives our law enforcement more time to work with data being gathered by technology companies to better protect our children and keep our communities safe,” said Rep. Anthony Gonzalez (R-Ohio), who sponsored the bill in the House.
Currently, companies are required to retain information on images depicting abuse for only 90 days. But lawmakers argue that is “often not enough for habitually under-resourced law enforcement to conduct the necessary investigative process.” The END Child Exploitation Act doubles that time and ensures companies are legally able to retain the material longer if needed.
“Crimes that once occurred solely in the physical space are now dominating the virtual world. Technology companies that are a hub for youth social interactions should recognize the need to assist law enforcement in their information-gathering efforts,” said Sen. Marsha Blackburn (R-Tenn.), who introduced the bill in the Senate alongside Democrat Catherine Cortez Masto.
RANT AND RAVE
Elon Musk's mother brought us a #TBT from long before the Tesla founder's Cybertruck days. Even the moms of billionaires can be embarrassing.
— Intel has appointed Jeff Rittener as Chief Government Affairs Officer and General Manager of Intel’s Governments, Markets and Trade group.
A video explainer from Recode on the technology that’s changing the meaning of the human face.