Every recent scandal about today’s technology titans has a familiar bottom line: The sites that have become inescapable features of day-to-day existence know where we live and work, what we buy and think about buying, whom we talk to and about what. They use their deep knowledge of us to make money, insufficiently mindful of the risks. All the while, most of us remain unaware.
It’s not hard to see how this happened. Privacy in the United States runs on a model of “notice and consent,” which is what it sounds like: Companies tell consumers what data they will collect and what they will do with it; consumers agree. But companies often do not tell consumers as much as they ought to, and consumers often cannot parse the disclosures anyway. Even tech-conscious users click automatically past privacy policies because they are long and complicated — and because they see no other option.
The line from many privacy advocates, which tech companies now also endorse, is that Congress must order firms to provide more information and require more approval from users. But this will not solve our problem. Yes, companies should do everything in their power to ensure users are aware of the specific purposes for which specific categories of their information will be processed. But that is not enough.
The free and informed consent that today’s privacy regime imagines simply cannot be achieved. Collection and processing practices are too complicated. No company can reasonably tell a consumer what is really happening to his or her data. No consumer can reasonably understand it. And if companies can continue to have their way with user data as long as they tell users first, consumers will continue to accept the unacceptable: If they want to reap the benefits of these products, this is the price they will have to pay.
But this is not a price consumers should have to pay. It is time for something new. Legislators must set expectations for companies that go beyond merely advising consumers that their personal information will be exploited. For some data practices, this might call for wholesale prohibition. For all data practices, a more fundamental change is needed: Companies should be expected, and required, to act reasonably to prevent harm to their users. They should exercise a duty of care. The burden should no longer rest with the user to avoid getting stepped on by a giant. Instead, the giants should have to watch where they’re walking.
Legislating and enforcing a new mind-set will be hugely challenging. Over the course of this year, we will write from time to time about the hard questions this will raise. But the difficulty of those questions can’t be used as an excuse not to act. It’s time for Congress to help create new norms for a digital age, with the cooperation of the companies themselves. Together they should try, as the founders of these companies did when the Internet was young, to build something that works.