June 28, 2019

DeepNude App, Which Created Nude Photos Of Women, Taken Down By Developer

The DeepNude app sparked outrage across the internet for its gross invasion of privacy and objectification of women.

New Delhi:

A highly controversial application that used artificial intelligence to generate nude images of women based on their clothed photographs has been taken down by the developer following a massive backlash.

The developers of "DeepNude" announced on Twitter on Thursday that they had created the project for "entertainment" and never expected it to go "viral". "We greatly underestimated the request," the developer, who goes by the alias "Alberto", said.

The application, offered as a download for a $50 fee, made it horrifyingly easy to create realistic nude photographs without any form of consent. It sparked outrage across the internet for its gross invasion of privacy and objectification of women. Many pointed out that the app did not work on men.

After months of circulating under the radar, the app drew attention following a report this week by the technology news website Motherboard. The creator of the app told another website, The Verge, that he believed someone else would soon make an app like DeepNude if he didn't do it first. "The technology is ready (within everyone's reach)," he said.

DeepNude is the latest in a line of worrying applications built on "machine learning" - a form of artificial intelligence - over which the technology industry has found itself on morally contentious ground. Since late 2017, the internet has been divided over "deepfake" videos, which use a similar technique to generate clips, usually of prominent people appearing to say things they never said.

