Child sexual abuse material is a clear scourge on the Internet, but almost no one can agree on what to do about it. Lawmakers this month advanced the Earn It Act to the Senate, stirring up a whole lot of controversy in the process.
According to privacy and speech advocates, issues abound. They fear the bill will encourage companies to abandon end-to-end encryption out of concern that judges will hold their inability to search for certain content against them; companies might also over-censor, squelching legitimate free expression, including by sexual minorities. These advocates also worry that this supposed solution could worsen the problem: widespread searching for CSAM on the mainstream Internet might drive bad actors deeper into the Dark Web. And by effectively compelling companies to comb their systems for CSAM, skeptics say, the bill could unintentionally turn them into government actors in the eyes of the law, allowing defendants to argue under the Fourth Amendment that they were subjected to unconstitutional searches.
These objections have some merit, the encryption conundrum foremost among them. To scan for CSAM everywhere, firms would have to build a back door into their systems that could be exploited by cybercriminals or by prying regimes abroad with little regard for civil liberties. Lawmakers shouldn't risk such a momentous change occurring as a byproduct of a separate legislative aim, no matter how noble. At the very least, the language related to encryption needs modifying to reduce this risk. But legislators should also explore other approaches to scrubbing out CSAM that haven't been tried, from increasing resources for enforcement to mandating that companies scan the metadata of communications rather than their contents.
More modest steps, admittedly, would produce more modest results — perhaps an unsatisfactory resolution to a grotesque problem. Yet further study of how this digital plague of abuse might be prevented without inviting new harms is essential.