The Washington Post
Democracy Dies in Darkness

FaceApp went viral with age-defying photos. Now Democratic leaders are warning campaigns to delete the Russian-created app ‘immediately’

Sean Bradley used FaceApp to make himself look older, one feature of the AI photo editing app that's trending across social media. (Sean Bradley)

The Democratic National Committee on Wednesday warned presidential campaigns against using the viral face-transforming FaceApp, citing the software’s Russian developers. It urged campaign staff to “delete the app immediately.”

The app allows users to upload photos of their faces and have them automatically edited to look like their future selves, replete with wrinkles and graying hair — a popular trick that filled the social media feeds of millions of users, including celebrities such as Drake, LeBron James and the Jonas Brothers.

But concerns over how the photos could be misused by the company, whose developers are headquartered in St. Petersburg, raised alarms among many users as well as DNC officials, who urged 2020 campaign staff and “people in the Democratic ecosystem” not to use the app.

“This novelty is not without risk: FaceApp was developed by Russians,” DNC security chief Bob Lord wrote in the alert to campaigns, which was first reported by CNN. “It’s not clear at this point what the privacy risks are, but what is clear is that the benefits of avoiding the app outweigh the risks. ... If you or any of your staff have already used the app, we recommend that they delete the app immediately.”

FaceApp has altered photos for more than 80 million users since its 2017 release and allows smartphone users to change a facial photo’s age, gender or hairstyle, often with convincing results. The app uses artificial-intelligence software to automatically alter the photos in seconds, much like similar features offered by Instagram and Snapchat.

The app is owned by the St. Petersburg-based Wireless Lab, though it has set the state or federal courts in Santa Clara County, Calif., in the heart of Silicon Valley, as the jurisdiction for the settlement of any legal disputes, according to its terms of service.

Founder and chief executive Yaroslav Goncharov told The Washington Post that FaceApp’s research-and-development team is based in Russia but that no user data is transferred into the country, and “most images” are deleted from company servers within 48 hours.

DNC officials were targeted by Russian hackers during the 2016 race and have invested heavily in cybersecurity measures to prevent a similar attack.

The app uploads people’s photos to the “cloud” of servers run by Amazon and Google, the company said, meaning deleting the app would likely make no difference in how the photos are used. In its privacy terms, the company said it can collect any of a user’s uploaded photos as well as data on the user’s visited websites and other information.

The app’s terms of service say users grant the company a “perpetual, irrevocable . . . [and] worldwide” license to use a user’s photos, name or likeness in practically any way it sees fit.

If a user deletes content from the app, FaceApp can still store and use it, the terms say. FaceApp also says it can’t guarantee that users’ data is secure, and that the company can share user information with other companies and third-party advertisers that aren’t disclosed in the privacy terms.

Goncharov said that users who want to remove their data from FaceApp can make the request through the app by clicking “Settings,” then “Support,” then “Report a bug” with “privacy” in the subject line. “Our support team is currently overloaded, but these requests have our priority,” a company statement read.

FaceApp’s terms of service say it can share information with a government agency if a subpoena, court order or search warrant is issued and the company has “a good faith belief that the law requires” it to do so. This information can also be shared with any country that FaceApp maintains facilities in, including Russia.

People who use the app also “consent to the processing, transfer and storage of information about you in and to the United States and other countries, where you may not have the same rights and protections as you do under local law.”

Baptiste Robert, a French security researcher who uses the pseudonym Elliot Alderson, said he inspected the network traffic between FaceApp on his phone and the Internet to understand what data the app sends about its users.

He found that only photos that are uploaded and modified are saved to the server, not the user’s entire camera roll. But he also said he didn’t think the app was compliant with the European Union’s new General Data Protection Regulation (GDPR).

“When you upload your photo, you have no idea how your photo is used,” Robert said, noting that the app’s terms and conditions are vague. “Don’t rush to use this application because you don’t know how your data is used after that.”

Kate O’Neill, a tech consultant, said FaceApp’s privacy terms are still murky, despite the company’s clarification.

“People should be savvy about when apps and memes and games are encouraging everyone to engage in the same way,” she said. “It puts the data in a vulnerable state that becomes something that can train facial recognition and other kinds of systems that may not be intended the way people are using it.”