McAfee, a global leader in online protection, has released its annual Most Dangerous Celebrity: Deepfake Deception List, revealing how cybercriminals are increasingly using the names, faces, and voices of top celebrities to scam consumers. According to the 2025 report, Shah Rukh Khan ranks #1 as the most exploited celebrity in AI-driven deepfake scams across India, followed by Alia Bhatt and Elon Musk.
The findings highlight a rapid surge in the misuse of celebrity likenesses—across both Indian and global personalities—to promote fake endorsements, fraudulent giveaways, and phishing schemes designed to lure users into scam websites or malicious downloads.
90% of Indians Have Encountered Fake or AI-Generated Celebrity Endorsements
McAfee’s research reveals alarming trends in the spread of deceptive AI-generated content:
- 90% of Indians say they have seen fake or AI-created celebrity endorsements online.
- Victims reported losing an average of ₹34,500 to such scams.
- 60% have witnessed deepfake content featuring influencers and online creators, not just mainstream celebrities, showing how fast misinformation and fraudulent content are evolving.
With generative AI advancing at unprecedented speed, scammers can now replicate a person’s voice using as little as three seconds of audio. These hyper-realistic deepfakes are being widely used to imitate celebrities promoting:
- Skincare and beauty products (42%)
- Giveaways and gift schemes (41%)
- Crypto or trading investments (40%)
- “Must-have” gadgets and supplements
Top 10 Most Dangerous Celebrities | Deepfake Deception List (2025): India
1. Shah Rukh Khan
2. Alia Bhatt
3. Elon Musk
4. Priyanka Chopra Jonas
5. Cristiano Ronaldo
6. MrBeast
7. Lionel Messi
8. Taylor Swift
9. Kim Kardashian
10. Members of BTS
“Deepfakes have changed the game for cybercriminals; they’re no longer hacking systems — they’re hacking human trust,” said Pratim Mukherjee, Senior Director of Engineering, McAfee. “India’s vibrant celebrity culture and massive online engagement make the threat even more dangerous. Technology can now effortlessly mimic the voices, faces, and mannerisms of people we admire. In a country where millions engage with celebrity and influencer content daily, such fakes can spread instantly. It’s becoming harder to tell what’s real and what’s not — making awareness, caution, and reliable protection tools more critical than ever.”
India, home to one of the most socially engaged digital populations in the world (95% of users are on WhatsApp, 94% on YouTube, and 84% on Instagram), is especially vulnerable to scams disguised as celebrity content. McAfee’s findings show that users between 25 and 44 are the most at risk: 62% of those aged 35–44 and 60% of 25–34-year-olds admitted to clicking on fake celebrity ads, compared with 53% of 18–24-year-olds. Scepticism rises with age, as only 46% of 45–54-year-olds and just 17% of those over 65 said they had ever fallen for such scams.
To help consumers fight back, McAfee combines education with AI-powered tools such as its Deepfake Detector, which analyses text, email, and video content to flag potential fakes – including deepfakes – and phishing attempts before they cause harm. As AI-generated media grows more convincing, these tools give people a way to verify what’s real before they click, share, or buy.
The findings reveal a clear trend: scammers are exploiting the trust people place in celebrities and influencers. Protecting consumers has never been more critical. Tools like McAfee’s Deepfake Detector help users identify AI-generated videos and manipulated celebrity content before fake endorsements mislead them.
The celebrities on these lists are targets, not perpetrators. Scammers hijack their likenesses and voices, without consent, to exploit the trust people place in familiar faces.
Disclaimer: Reference to any individual mentioned in this report is for educational and informational purposes only and does not imply any form of endorsement, involvement, or affiliation with the malicious activities associated with their names.