A deepfake bot generated ‘nude’ pictures of over 100,000 women


A new report from security firm Sensity revealed that 104,852 women were targeted in July by an artificial intelligence-powered bot that allowed users to create fake nudes of them from images on social media. The pictures were later shared and traded in various Telegram channels.

These seven affiliated channels on the messaging app attracted 103,585 users by the end of July, according to Sensity’s investigation.

“While this figure does not account for the likelihood that many members are part of multiple channels, the ‘central hub’ channel alone attracted 45,615 unique members,” researchers wrote in the report.

A large majority of the deepfake bot’s users (70%) indicated that they are from Russia or other Eastern European countries, a statistic explained by the fact that, aside from Telegram, the bot and its affiliated channels were advertised on VKontakte, Russia’s largest social media platform.

Even more surprisingly, most members indicated that they were using the bot to target private individuals they knew in real life, rather than celebrities or influencers (63% versus 16%).

The bot used to create fake nudes of these women is an open-source version of DeepNude software, in which artificial intelligence “strips” images of clothed individuals by automatically generating a realistic approximation of their naked bodies.

DeepNude made headlines in June 2019 when its developers shut down its website just a day after it received mainstream press coverage, explaining that “the probability that people will misuse it is too high.” The creators later sold the DeepNude license to an anonymous buyer for $30,000, after which the software was reverse-engineered.

While Telegram’s deepfake bot is free to use, researchers discovered that users can pay for “premium coins” to remove watermarks from generated nudes and skip the free-user processing queue.

Although Sensity notes that users of this bot are likely primarily interested in consuming deepfake pornography, the report warns that bot-generated nudes can “be explicitly weaponized for the purposes of public shaming or extortion-based attacks.”

While deepfake technology has long been used to target celebrities and political leaders, countermeasures are emerging: Adobe recently announced a “beta” version of a feature that will help photo editors assess the authenticity of a picture. According to CNN, this feature could indicate who created the image and where, and provide a thumbnail of the original image along with data about how it has been altered. RGA

RELATED STORIES:

Microsoft unveils ‘deepfake’ detector ahead of US vote

Researchers call for harnessing, regulation of AI
