Sara M. Watson is a technology critic and an affiliate at the Berkman Klein Center for Internet and Society at Harvard University.

Have you taken a close look at your ads lately?

Washington and Silicon Valley have been shocked by each new revelation of how Russian operatives bought ads on Google, Facebook, Twitter and other social media networks during the 2016 campaign. Kremlin-linked troll farms bought cheap advertising with very wide reach, possibly getting their messages in front of millions of American voters.

Such ads are our clearest example yet of the ways that personalization and microtargeting — basics in the business of data on the Internet — can be weaponized.

And it’s not just ads we need to worry about; it’s all forms of personalization. In a world where more and more interfaces are personalized, we need better means to assert our preferences and protect against misuse. That Russian interference relied on standard business practices, such as buying ads on Facebook, puts into stark relief how little oversight exists over the potential misuses and abuses of the technologies that filter our daily lives.

Ads are the best signal we have of how our personal data is being used. We’ve begun to see examples of how targeted advertising online has moved from the commercial into the political sphere. ProPublica has reported that Facebook made it possible for advertisers to reach racial proxies by buying ads against categories such as “ethnic affinity” groups and audiences of self-described “Jew haters.” A leaked document showed an Australian Facebook ad team presenting research on how emotionally unstable teens might be targeted. The Trump campaign deployed “Super Predator” dark posts targeting black voters to suppress turnout just before the election. But Russia’s digital tactics demonstrate just how far exploitative microtargeting can go.

Today there is no neutral interface, no unfiltered feed. From music recommendations and algorithmically generated news feeds to search results and the front pages of news sites, our digital lives are tailored to match our unique behavioral patterns. We can’t toggle between a neutral experience and a personalized one.

Most of what feeds into microtargeting is based on assumptions — algorithms observe past behavior, process it through a prioritization system and spit back out more of what you will like, or what will get you to spend more of your money or time. At best, these systems anticipate our needs and interests and tailor our experience accordingly. At worst, though, they are deployed to take advantage of us.
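
To see how that loop works in practice, here is a minimal, purely illustrative sketch in Python: past clicks are turned into interest weights, and candidate posts are ranked so that more of the same rises to the top. Every topic, score and variable name here is hypothetical; real feed-ranking systems are vastly more complex.

```python
# Illustrative sketch only: a toy version of the "observe past behavior,
# prioritize, serve more of the same" loop. All names and weights here are
# hypothetical; real feed-ranking systems are far more complex.
from collections import Counter

# Past behavior: topics of posts this user clicked on.
click_history = ["shoes", "politics", "shoes", "cats", "shoes"]

# Candidate posts waiting to be ranked, each tagged with a topic.
candidates = [
    {"id": 1, "topic": "shoes"},
    {"id": 2, "topic": "politics"},
    {"id": 3, "topic": "gardening"},
]

# Infer interests from past clicks: each topic's weight is its share of clicks.
clicks = Counter(click_history)
total = sum(clicks.values())
interest = {topic: n / total for topic, n in clicks.items()}

# Prioritize: score each candidate by the user's inferred interest in its
# topic (unseen topics score zero) and show the highest-scoring items first.
ranked = sorted(candidates, key=lambda c: interest.get(c["topic"], 0.0), reverse=True)
for post in ranked:
    print(post["id"], post["topic"], interest.get(post["topic"], 0.0))
```

Notice what never appears in this loop: anything you have told the system you actually want. The only input is what you did before.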

It’s hard to find the fuzzy line between appropriate uses and misuses of the technology. What’s the difference between a retargeted ad selling shoes and one arguing for protection of the Second Amendment?

This is about more than just annoying ads. It’s about our agency to understand and control the interfaces with which we live every day.

Facebook has responded with an action plan to address election integrity. Ads targeted to different users will be collected on a single page, so users can compare what they are shown with what others are seeing. Facebook will also finally make political advertising disclosures more transparent, as regulation already requires of TV and other media. But these efforts do not address the wider influence of microtargeting and personalization on this and other platforms.

Interfaces dealing in user data need to be held accountable to their users. Beyond the ones Facebook has proposed, there are a number of solutions that platforms and regulators can pursue.

First, platforms need to develop ways of expressing the degree to which an experience is personalized. That would let them explain the inputs that go into a personalized feed or recommendation, and would give users a way to interact with and respond to those inputs or algorithm weights. Imagine telling the Facebook feed that for the next week you want to see more news articles and fewer cat videos. Platforms could then test their assumptions against our stated preferences and deliver even better personalized experiences. We need that level of granular control over personalized interfaces to gain real agency over our experiences.
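
What might such a dial look like under the hood? Here is a hypothetical sketch, continuing the toy ranking above, in which explicit user preferences re-weight the platform’s inferred scores. None of these names or numbers reflect any real platform’s API; they simply illustrate how a stated preference could combine with an algorithmic one.

```python
# Hypothetical sketch of user-adjustable personalization: the platform's
# inferred relevance scores are re-weighted by explicit user preferences
# ("more news, fewer cat videos"). Nothing here reflects a real platform API.

# Platform-inferred relevance scores for candidate content (hypothetical).
base_scores = {"news_article": 0.4, "cat_video": 0.9, "friend_photo": 0.6}

# Explicit user dials: >1.0 means "show me more", <1.0 means "show me less".
user_dials = {"news_article": 1.5, "cat_video": 0.25}

def adjusted_score(content_type: str) -> float:
    """Combine the platform's inferred score with the user's stated preference."""
    return base_scores[content_type] * user_dials.get(content_type, 1.0)

# Re-rank the feed with the user's dials applied.
for content_type in sorted(base_scores, key=adjusted_score, reverse=True):
    print(content_type, round(adjusted_score(content_type), 2))
```

A multiplier is just one possible design. The point is that the user’s stated preference becomes an explicit input the system must honor, rather than something inferred behind the scenes.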

Second, we need to apply some of the lessons from traditional advertising to the Wild West that is online advertising. We need to regulate appropriate and inappropriate uses of microtargeting, with consumer protection principles in mind. Whether updating Federal Election Commission regulations for political advertising standards, or looking to industry self-regulation as was once done for subliminal advertising, further oversight over digital targeting is needed.

Finally, we need more data on these practices, and better public literacy about them, so we can develop normative stances on which forms of targeting are acceptable and appropriate. The Russian-bought ads have struck a nerve, but we need more examples of microtargeting out in the open to clearly articulate our appetite for personalized experiences.

Facebook already lets you manage the categories its algorithms have determined you might be interested in, often based on apps you’ve installed or pages you’ve liked or interacted with. You can also ask, “Why am I seeing this ad?” by clicking the “…” in the top right corner of a post or ad. You might be on a brand’s customer list, or you could be targeted as part of a broad category such as “women ages 18 to 34 who live in the United States.” The same feature lets you remove certain people or publications from your feed. For now, though, these opportunities to provide input happen through micro-interactions or are buried in Ad Preferences settings.

These approaches won’t solve all problems. We may remove protected categories from ad-targeting drop-down menus, but advanced data services such as Cambridge Analytica will be able to microtarget based on proprietary email lists. Platforms maintain that giving users more control results in information overload and management fatigue, but the option for further insight and control needs to exist for at least the savviest users. For the rest of us, we need mechanisms for oversight to identify instances of misuse and abuse to protect against further exploitation and manipulation.

Subliminal advertising was regulated away, for good reason. Exploitative microtargeting is its modern equivalent in our digital lives. It’s time we take personalization personally.
