New Delhi, October 21: A bot based on artificial intelligence (AI) has been developed by unidentified cybercriminals to target women. The bot lets anyone create fake nude images of women from a regular picture, according to a cyber research agency tracking deepfakes. The service has been used to target at least 1,00,000 women online. A deepfake is manipulated media in which the face or voice of a person is swapped with someone else's using artificial intelligence.

"Our investigation of this bot and its affiliated channels revealed several key findings. Approximately 104,852 women have been targeted and had their personal “stripped” images shared publicly as of the end of July 2020. The number of these images grew by 198% in the last 3 months," Sensity, a Netherlands-based cyber research agency, said in its report released on Tuesday. The service has been used by nearly 1,04,000 people, mostly Russians, the report added.

The bot allows a person to upload a picture of the woman whom s/he wants to target. On the user's request, the bot can delete any clothing and replace it with fake skin and private parts that appear real but are not. According to Sensity's report, a watermarked fake nude photo can easily be made through the service without any cost. Users can pay $1.50 (about Rs 110) to remove the watermark, the report said. The tool appears to be a version of DeepNude, a software first released anonymously in 2019.

"On July 19th 2019, the creators sold the DeepNude licence on an online marketplace to an anonymous buyer for $30,000. The software has since been reverse-engineered…" Sensity said. "The bot’s significance, as opposed to other tools for creating deepfakes, is its accessibility, which has enabled tens of thousands of users to non-consensually strip these images," Henry Ajder, the lead author of the report, was quoted by Hindustan Times as saying.

Earlier, the technology was often used to create fake pornographic content of celebrities. But now, any woman whose images are available online can be targeted. The tool can also be used for revenge porn.

(The above story first appeared on LatestLY on Oct 21, 2020 08:30 AM IST.)