An app developer who created an algorithm that can digitally undress women in photos has pulled the plug on the software after high traffic and a viral backlash convinced him that the world is not ready for it.

DeepNude used artificial intelligence to create the “deepfake” images, presenting realistic approximations of what a woman — it was not designed to work on men — might look like without her clothes. Deepfake photos and videos often appear credible to the average viewer, prompting concerns by researchers and lawmakers about their potential to mislead the public, especially in the run-up to the 2020 election.

Last month, a doctored clip of House Speaker Nancy Pelosi (D-Calif.) that had been altered to make her slur her words went viral, drawing attention to how even poorly made videos can be used to spread political disinformation at alarming speeds.

“Deep-fake technologies will enable the creation of highly realistic and difficult to debunk fake audio and video content,” Danielle Citron, a law professor at the University of Maryland, testified before a House committee on the dangers of deepfakes this month. “Soon, it will be easy to depict someone doing or saying something that person never did or said. Soon, it will be hard to debunk digital impersonations in time to prevent significant damage.”

Though much has been made of the technology’s threat to national security, it has also been harnessed to make a torrent of fake porn, including widely circulated videos of celebrities such as Gal Gadot and Scarlett Johansson. Although sites including Reddit, Twitter and Pornhub have tried to ban pornographic deepfakes, they have had limited success. The technology is cheap and easily accessible, and the opportunities for use are limitless.

The free version of DeepNude placed a large watermark on images it generated. The $50 version, however, applied only a small stamp reading “FAKE” to the upper-left corner of the pictures. As the online magazine Motherboard noted, it could easily be cropped out.

When Motherboard first reported on the app on Thursday, its creator, a programmer who goes by “Alberto,” insisted he was “not a voyeur,” merely a technology enthusiast who was driven to create the app out of “fun and enthusiasm.”

“Also due to previous failures (other start-ups) and economic problems, I asked myself if I could have an economic return from this algorithm,” the programmer told Motherboard. “That’s why I created DeepNude.”

The app, which was available for Windows and Linux, was based on an open-source algorithm from the University of California at Berkeley, Alberto told Motherboard. DeepNude was taught to create convincing nudes using 10,000 naked images.

DeepNude’s creator said he mulled the ethics of his software but ultimately decided the same results could be accomplished through any number of photo-editing programs.

“If someone has bad intentions, having DeepNude doesn’t change much. . . . If I don’t do it, someone else will do it in a year,” Alberto said.

Soon after Motherboard’s report, traffic caused the server to crash. Late Thursday, after further coverage and outrage on social media, Alberto took to Twitter to announce DeepNude’s end, saying the chances of people abusing the app were too high.

“We don’t want to make money this way,” the tweet read. “Surely some copies of DeepNude will be shared on the web, but we don’t want to be the ones who sell it.”

“The world is not yet ready for DeepNude,” it said.

Pornographic deepfake images don’t technically count as revenge porn because they aren’t actual images of real women’s bodies, but they are still capable of causing psychological damage. California is considering a bill that would make pornographic deepfakes illegal; if passed, it would make California the first state to take legislative action against them.