You can find the current article at its original source at https://www.bbc.co.uk/news/technology-48799045

App that can remove women's clothes from images shut down
An app that claimed to be able to digitally remove the clothes from pictures of women to create fake nudes has been taken offline by its creators.
The $50 (£40) Deepnude app won attention and criticism because of an article by tech news site Motherboard.
One campaigner against so-called revenge porn called the app "terrifying".
The developers have now removed the software from the web, saying the world was not ready for it.
"The probability that people will misuse it is too high," wrote the programmers in a message on their Twitter feed. "We don't want to make money this way.""The probability that people will misuse it is too high," wrote the programmers in a message on their Twitter feed. "We don't want to make money this way."
Anyone who bought the app would get a refund, they said, adding that there would be no other versions of it available and withdrawing the right of anyone else to use it.Anyone who bought the app would get a refund, they said, adding that there would be no other versions of it available and withdrawing the right of anyone else to use it.
The developers also urged people who had a copy not to share it, although the app will still work for anyone who owns it.The developers also urged people who had a copy not to share it, although the app will still work for anyone who owns it.
Revenge victims
The team said the app was created as "entertainment" a few months ago.
This development work led them to set up a website offering Windows and Linux versions of the app. The program was available in two versions - a free one that put large watermarks over created images and a paid version that put a small "fake" stamp on one corner.
In their statement, the developers added: "Honestly, the app is not that great, it only works with particular photos."
Despite this, the interest generated by the Motherboard story led the app owner's website to crash as people sought to download the software.
Speaking to Motherboard, Katelyn Bowden, founder of anti-revenge porn campaign group Badass, called the app "terrifying".
"Now anyone could find themselves a victim of revenge porn, without ever having taken a nude photo," she told the site. "This tech should not be available to the public."
The program reportedly uses AI-based neural networks to remove clothing from images of women to produce realistic naked shots.
The networks have been trained to work out where clothes are in an image, mask them by matching skin tone, lighting and shadows and then fill in estimated physical features.
The technology is similar to that used to create so-called deepfakes, which manipulate video to produce convincingly realistic clips. Early deepfake software was used to create pornographic clips of celebrities.