
Are Shah Rukh Khan Deepfake Videos Leading Indians into Costly Scams?

AI-generated content is spreading quickly across social platforms, and almost 90 percent of Indians have encountered fake or AI-generated celebrity content this year, according to McAfee's annual "Most Dangerous Celebrity: Deepfake Deception List."

The report states that cybercriminals are misusing celebrity names and likenesses to deceive people and extract money. On average, victims lose about Rs 34,000 in such scams.


The data shows that Shah Rukh Khan is the most exploited celebrity face in India, followed by Alia Bhatt, Elon Musk, Priyanka Chopra Jonas and Cristiano Ronaldo. The global list includes celebrities like YouTuber MrBeast, Lionel Messi, Taylor Swift, Kim Kardashian and members of BTS.

Scammers use AI-generated deepfakes of these celebrities to lure victims into transferring money through fake endorsements, giveaways and links to scam websites. With as little as three seconds of someone's voice, fraudsters can now create convincing audio deepfakes without consent.

McAfee conducted the survey online in August, polling around 8,600 adults on the impact of these scams across Australia, France, Germany, India, Japan, the United Kingdom and the United States.

The rise of these fraud schemes is worrying because countless original photos, videos and audio recordings of celebrities are publicly available and can be misused for illegal activities. Many Indian celebrities are already taking legal action against AI misuse by enforcing personality rights that safeguard their name, image, likeness and voice.

These legal measures, such as injunctions and takedown orders, help curb commercial exploitation, deepfakes and digital impersonation. The bigger issue, however, is that there are still no strong laws to prevent the creation of celebrity deepfakes in the first place.

The report stresses the need for stricter guidelines from social media platforms on AI-generated content and how far it can spread.
