San Francisco, January 26: Google-owned YouTube has deleted more than 1,000 deepfake scam ad videos from its platform, saying it is “investing heavily” to stop AI-generated celebrity scam ads. The removals followed a 404 Media probe into an advertising ring that used AI to make it appear that celebrities such as Taylor Swift, Steve Harvey, and Joe Rogan were promoting Medicare scams.

Such videos had racked up nearly 200 million views, with both users and celebrities regularly complaining about them, the report said. YouTube said it is “aware” that its platform is being used to run AI-generated ads featuring celebrities and is working to stop such deepfakes.

The YouTube action came as non-consensual deepfake porn of Taylor Swift went viral on X, with one post garnering more than 45 million views and 24,000 reposts, and staying live for around 17 hours before it was removed.

A report from 404 Media found that the images may have originated in a group on Telegram, where users share explicit AI-generated images of women. Users in the group also reportedly joked about how the images of Swift went viral on X. According to the latest research from cybersecurity firm Deeptrace, about 96 per cent of deepfakes are pornographic, and they almost always portray women.

(The above story first appeared on LatestLY on Jan 26, 2024 11:09 AM IST. For more news and updates on politics, world, sports, entertainment and lifestyle, log on to our website.)