Fake Celebrity Endorsements: Spot AI Deepfakes
How to Check If a Celebrity Endorsement Is Real or a Deepfake
A retiree in Arizona saw a Facebook ad featuring what appeared to be Elon Musk endorsing a new cryptocurrency platform. The ad included a photo of Musk holding a phone displaying the app, along with a quote about how "this platform will change everything." The ad looked professional. The photo looked real. She invested $5,000 through the linked platform. The photo was a deepfake — Musk had never endorsed the platform, and the "investment" site was designed to steal her money. She lost everything she deposited.
Deepfake celebrity endorsements have become one of the fastest-growing tools in online fraud. Scammers use AI to generate realistic photos and videos of celebrities promoting crypto platforms, miracle products, and investment schemes. According to researchers, deepfake scam ads featuring celebrities increased by over 400% between 2024 and 2025. Our AI Image Detector can help you identify these fakes before you fall for them.
How Celebrity Endorsement Scams Work
The playbook is straightforward:
- Scammers create a deepfake image or video of a well-known celebrity using AI tools. Popular targets include Elon Musk, Jeff Bezos, MrBeast, Oprah Winfrey, and other high-profile figures.
- The fake endorsement promotes a product or platform — usually a cryptocurrency investment, a "money-making system," a miracle health product, or a financial tool.
- The ad runs on social media — Facebook, Instagram, YouTube, TikTok, or as sponsored content on news sites.
- Victims click through to a professional-looking website where they deposit money, share credit card details, or download malicious software.
- The money disappears, the product is fake, or the software steals personal information.
Celebrities Commonly Targeted
Scammers gravitate toward celebrities that people associate with wealth, technology, and success:
- Elon Musk: The most impersonated person in crypto scams. Fake Musk endorsements for Bitcoin platforms, AI trading bots, and giveaway schemes are everywhere.
- MrBeast: His association with giveaways makes him a perfect target for "free money" scam ads.
- Jeff Bezos: Fake endorsements for investment platforms and "Amazon insider" trading tools.
- Oprah Winfrey: Used to promote fake health products, weight loss supplements, and financial platforms.
- Mark Zuckerberg: Ironically used in scam ads that run on his own platforms.
- Local news anchors: Scammers also deepfake local news personalities to create fake news segments endorsing products.
How to Screenshot Suspicious Celebrity Content
When you see a celebrity endorsement that seems surprising or too good to be true:
- Screenshot the ad or post before it disappears. Scam ads are often taken down quickly and replaced with new ones.
- Capture the celebrity's face and body at the largest size available. Tap to expand the image if possible.
- Screenshot any text overlay or quotes attributed to the celebrity.
- Save the URL of the page or platform being promoted.
- If it's a video, take screenshots of frames where the celebrity's face is clearly visible.
How to Upload to the AI Image Detector
Check the celebrity endorsement before you believe it:
- Open our AI Image Detector on your phone or computer.
- Upload the screenshot of the celebrity image from the ad or post.
- Wait for the analysis. The detector examines the image for deepfake indicators and AI manipulation.
- Review the results — the tool will flag areas of concern and provide a confidence score.
- Check multiple screenshots if the ad uses several celebrity images.
Signs of a Deepfake Celebrity Image
Facial Anomalies
- Unnatural skin texture: Deepfake faces often have overly smooth skin that lacks the natural pores, wrinkles, and texture variations of the real person.
- Lighting mismatches: The lighting on the celebrity's face may not match the lighting in the rest of the image. Shadows may fall in the wrong direction or be missing entirely.
- Edge artifacts around the face: Where the deepfake face meets the neck, hair, or background, there are often subtle blending artifacts — blurring, color shifts, or unnatural transitions.
- Eye and teeth irregularities: Eyes may have inconsistent reflections, and teeth may appear too uniform or slightly merged.
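For technically inclined readers, the "over-smooth skin" cue above can be roughly quantified in a few lines of Python. This is only an illustrative heuristic, assuming the Pillow library is installed; the function name and approach are our own, and it is no substitute for a trained detector:

```python
import statistics
from PIL import Image, ImageFilter  # Pillow: pip install Pillow

def texture_score(path):
    """Rough texture heuristic: variance of an edge-filtered image.

    Natural skin has pores and wrinkles that produce scattered edge
    responses; AI-smoothed faces tend to score noticeably lower.
    """
    gray = Image.open(path).convert("L")
    edges = gray.filter(ImageFilter.FIND_EDGES)
    # Drop the 1-pixel border, which the edge filter leaves unprocessed.
    edges = edges.crop((1, 1, edges.width - 1, edges.height - 1))
    return statistics.pvariance(list(edges.getdata()))
```

Comparing the score of a tight face crop against crops from known-real photos of the same person gives a crude smoothness baseline; treat any single number with skepticism.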
Body and Context Issues
- Head-body mismatch: The celebrity's head may appear slightly too large or small for the body, or the skin tone of the face may differ from the neck and hands.
- Impossible scenarios: The celebrity appears in a setting, or wearing clothing, that doesn't match their public persona or known activities.
- Props that don't exist: The celebrity "holding" a product or phone that has distortions, warping, or impossible geometry.
- Wrong proportions: Hands, fingers, and accessories near the face often show AI generation artifacts.
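The blending and splicing artifacts described above can sometimes be surfaced manually with a classic forensic technique called error level analysis (ELA): re-save the image as a JPEG and look at where the recompression differences are uneven. Below is a minimal sketch, again assuming Pillow is installed; the function name and quality setting are our own choices:

```python
import io
from PIL import Image, ImageChops  # Pillow: pip install Pillow

def error_level_analysis(path, quality=90):
    """Re-save as JPEG and return the amplified difference image.

    Regions that were pasted in or generated separately often
    recompress differently and show up as brighter patches.
    """
    original = Image.open(path).convert("RGB")
    buf = io.BytesIO()
    original.save(buf, "JPEG", quality=quality)
    buf.seek(0)
    diff = ImageChops.difference(original, Image.open(buf))
    # The raw differences are faint; scale them up so they are visible.
    max_diff = max(band_max for _, band_max in diff.getextrema()) or 1
    return diff.point(lambda px: int(px * 255.0 / max_diff))
```

Open the returned image and look for a face region that glows differently from the rest of the photo. ELA gets noisy on images that have been re-shared and recompressed many times, so treat it as one signal among several, not proof.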
Ad and Content Red Flags
- Urgency language: "Limited time offer," "Act now before it's gone," "Only 50 spots left."
- Unverified social media accounts: Real celebrity posts come from verified accounts with millions of followers.
- Comments are disabled or fake: Scam ads often disable comments or fill them with bot accounts praising the product.
- The platform or product is unknown: If you've never heard of the investment platform or product outside this one ad, be very skeptical.
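The urgency-language red flag is simple enough to check mechanically. Here is a small illustrative Python sketch; the phrase list and function name are our own, and real scam copy varies endlessly, so an empty result proves nothing:

```python
import re

# Illustrative sample of phrases scammers use to manufacture urgency.
URGENCY_PATTERNS = [
    r"limited time",
    r"act now",
    r"only \d+ (spots|seats|places) left",
    r"before it'?s gone",
    r"guaranteed returns?",
]

def urgency_flags(ad_text):
    """Return the urgency patterns that match the ad copy (case-insensitive)."""
    text = ad_text.lower()
    return [p for p in URGENCY_PATTERNS if re.search(p, text)]
```

For example, `urgency_flags("Limited time offer: only 50 spots left!")` returns two matches, while ordinary product copy should return none.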
The Reality Check
Remember these facts when you see a celebrity endorsement:
- Celebrities don't DM strangers with investment tips or business opportunities.
- Real endorsement deals are public and announced through official channels, not random social media ads.
- No legitimate celebrity would endorse an unregistered investment platform or a "guaranteed returns" scheme.
- If a celebrity were actually endorsing something, it would be covered by major news outlets, not just one social media ad.
Check It With Our AI Image Detector
Deepfake technology is advancing rapidly, and scammers are using it to put words in celebrities' mouths and products in their hands. These fake endorsements are designed to borrow trust — you trust the celebrity, so you trust the product. Don't let that borrowed trust cost you money. The next time you see a celebrity endorsing a crypto platform, investment opportunity, or miracle product, screenshot the image and upload it to our AI Image Detector. The tool detects deepfake artifacts, AI manipulation, and image editing that are invisible to the casual viewer. Celebrities don't DM you investment tips — always verify.
Courtney Delaney
Founder, ScamSecurityCheck
Courtney Delaney is the founder of ScamSecurityCheck, dedicated to helping people identify and avoid online scams through AI-powered tools and education.
Support Our Mission
ScamSecurityCheck is built to protect people from online fraud. Your contribution helps us keep building free security tools and resources.
