MANGALURU: The ‘Saree Challenge’ has been going viral, with social media flooded with photos of sari-clad women. However, reports suggest that these pictures of women in their six yards of elegance can be used for catfishing.
According to reports, Deepnude.com, a website that came under fire last year for creating fake nude pictures of clothed people, is back. Cyber experts warn that ‘Saree Challenge’ pictures uploaded on social media give endless opportunities to cyber criminals using this website.
“It’s time to not just stay home and stay safe, but also to be cyber safe,” said Ananth Prabhu G, professor at Sahyadri College of Engineering and Management and cyber security trainer.
The original version of DeepNude — a machine-learning application that replaced women’s clothing in photos with an approximation of what they would look like underneath to create fake naked images of them — was seemingly abandoned after its developers, based in Estonia, realised, with the help of some online outrage, that their misogynistic creation could be misused.
According to Prabhu, Deep Fake content is created by using two competing artificial intelligence algorithms — the generator and the discriminator. “The generator, which creates the phoney multimedia content, asks the discriminator to determine whether the content is real or artificial. So it’s more like a war between the generator and discriminator,” he said.
“As the discriminator gets better at spotting fake videos, the generator gets better at creating them. Now, the same technology is used to create DeepNude software, which takes a photo of a clothed person and creates a new, naked image of the person,” he added. There are tools that can detect Deep Fakes and Deep Nudes. “However, the initial damage done cannot be reversed easily,” Prabhu warned.
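The generator-versus-discriminator setup Prabhu describes is known as a generative adversarial network (GAN). The toy sketch below, written in Python with PyTorch as an assumed framework, shows the adversarial training loop on a simple two-dimensional dataset; the network sizes and data are purely illustrative and are not how DeepNude or any specific Deep Fake tool is actually built.

```python
# Toy GAN: the generator learns to mimic a simple 2-D "real" distribution,
# while the discriminator learns to tell real samples from generated ones.
# Illustrative only; sizes, data and hyperparameters are assumptions.
import math
import torch
import torch.nn as nn

torch.manual_seed(0)

def real_samples(n):
    # "Real" data: noisy points on a circle, standing in for genuine images.
    angles = torch.rand(n, 1) * 2 * math.pi
    points = torch.cat([torch.cos(angles), torch.sin(angles)], dim=1)
    return points + 0.05 * torch.randn(n, 2)

# Generator maps random noise to candidate samples; discriminator scores
# how "real" a sample looks (logit output).
generator = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 2))
discriminator = nn.Sequential(nn.Linear(2, 32), nn.ReLU(), nn.Linear(32, 1))

g_opt = torch.optim.Adam(generator.parameters(), lr=1e-3)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

for step in range(2000):
    # 1. Train the discriminator: label real samples 1, generated samples 0.
    real = real_samples(64)
    fake = generator(torch.randn(64, 8)).detach()
    d_loss = (loss_fn(discriminator(real), torch.ones(64, 1)) +
              loss_fn(discriminator(fake), torch.zeros(64, 1)))
    d_opt.zero_grad()
    d_loss.backward()
    d_opt.step()

    # 2. Train the generator: try to make the discriminator call fakes real.
    fake = generator(torch.randn(64, 8))
    g_loss = loss_fn(discriminator(fake), torch.ones(64, 1))
    g_opt.zero_grad()
    g_loss.backward()
    g_opt.step()

    if step % 500 == 0:
        print(f"step {step}: d_loss={d_loss.item():.3f} g_loss={g_loss.item():.3f}")
```

As the loop runs, the two losses push against each other: the discriminator improves at spotting fakes, which in turn forces the generator to produce samples that look ever closer to the real data, which is the "war" Prabhu refers to.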
The fake photographs can either end up on pornographic websites or be used to blackmail or harass women.