
Stop Looking at the Hands: How to Actually Spot a Deepfake in 2026

We are living in the “Fog of War.” With Minneapolis currently under a curfew and social media flooded with conflicting videos of the Alex Pretti shooting, my DMs are full of people asking: “Is this video real?” Here is the scary truth: The advice you learned in 2024 is useless. “Look at the hands”? AI fixed that last year. “Look for blinking”? The new Sora 2 and Runway Gen-4 models mastered blinking months ago. Here is how to actually spot a deepfake in 2026.

If you are relying on old tricks, you are going to get fooled. And right now, getting fooled is dangerous. Here are the four advanced tells that still work in 2026, plus the one verification tool you should start using today.

1. The “Side Profile” Glitch (The Ear Test)

AI is incredible at generating faces looking forward. It still sucks at turning heads.

  • The Tell: Watch the ears and the jawline when a subject turns their head to the side.
  • What to look for: In most AI videos, the ear will blur, change shape, or “detach” slightly from the jaw as the head rotates. It’s a texture-mapping error that models like Kling and Hailuo still haven’t solved.
  • The Rule: If they never turn their head, be suspicious. If they do, stare at the ear.

2. Check for the “CR” Pin (The Nutrition Label)

Stop guessing and start scanning. As of late 2025, major camera makers (Sony, Nikon, Canon) and platforms (TikTok, YouTube) adopted the C2PA Standard (Coalition for Content Provenance and Authenticity).

  • The Tool: Look for the small “CR” (Content Credentials) icon in the corner of the video player or image.
  • How to use it: If you click that pin, it shows you the “Digital Nutrition Label.” It will tell you if the image was captured by a camera or generated by software.
  • Note: If a viral video on X (Twitter) has stripped the metadata (which X often does), treat it as “Unverified” immediately.
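If you have the original file rather than a platform re-encode, you can run a quick check for the C2PA container yourself. Content Credentials in JPEGs live in JUMBF boxes carried inside APP11 marker segments, so a crude heuristic is to walk the JPEG marker list and look for an APP11 segment that mentions a JUMBF or C2PA box. The sketch below is my own illustrative presence check, not a real verifier: it proves nothing about signatures or tampering, which require a proper C2PA validation library.

```python
def has_c2pa_segment(data: bytes) -> bool:
    """Heuristic: scan JPEG marker segments for APP11 (0xFFEB) payloads
    that reference JUMBF/C2PA boxes, where Content Credentials are stored.
    Presence check only; this does NOT validate the credential."""
    if not data.startswith(b"\xff\xd8"):  # SOI marker: not a JPEG at all
        return False
    i = 2
    while i + 4 <= len(data):
        if data[i] != 0xFF:
            break  # lost marker sync; give up
        marker = data[i + 1]
        if marker in (0xD9, 0xDA):  # EOI or start-of-scan: headers are over
            break
        # Segment length is big-endian and includes its own two bytes
        length = int.from_bytes(data[i + 2:i + 4], "big")
        segment = data[i + 4:i + 2 + length]
        if marker == 0xEB and (b"jumb" in segment or b"c2pa" in segment):
            return True
        i += 2 + length
    return False
```

A “False” here means nothing was found in the file you have, which, per the note above, is exactly what you should expect after X or a messaging app strips the metadata; treat that as “Unverified,” not as “fake.”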

3. The “Breathing” Test (Audio)

Deepfake audio (voice cloning) is now actually harder to spot than deepfake video, but it has one flaw: breathing.

  • The Tell: AI voice models are trained on speech, not biology. They often forget to inhale.
  • What to listen for: In a real angry rant or emotional plea, a human gasps for air. They pause. Their pitch cracks.
  • The AI Flaw: AI voices are often “too perfect.” The cadence is rhythmic, and the breathing sounds are either missing or inserted at weird, grammatically correct intervals rather than biological ones.
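You can even put a rough number on that “too perfect” cadence. The sketch below is my own illustrative heuristic, not a published detector: it finds silent stretches in a mono audio signal (candidate breath pauses) and measures how evenly they are spaced. Real speech pauses irregularly, so a coefficient of variation near zero is one weak warning sign, never proof on its own. It assumes float samples, e.g. as loaded by soundfile or scipy.

```python
import numpy as np

def pause_regularity(samples: np.ndarray, sr: int,
                     frame_ms: int = 50, silence_db: float = -40.0) -> float:
    """Return the coefficient of variation (std/mean) of the gaps between
    silent stretches. Values near 0 mean metronome-like pausing, which is
    suspicious; NaN means too few pauses to judge."""
    frame = max(1, int(sr * frame_ms / 1000))
    n = len(samples) // frame
    chunks = samples[: n * frame].reshape(n, frame)
    rms = np.sqrt((chunks ** 2).mean(axis=1))
    db = 20 * np.log10(np.maximum(rms, 1e-10))  # floor avoids log(0)
    silent = db < silence_db
    # Frame indices where a silent run begins (False -> True transition)
    starts = np.flatnonzero(silent[1:] & ~silent[:-1]) + 1
    if len(starts) < 3:
        return float("nan")
    gaps = np.diff(starts).astype(float)
    return float(gaps.std() / gaps.mean())
```

On a real recording you would expect a clearly positive value; a long clip scoring near zero deserves the same skepticism as a speaker who never gasps for air.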

4. The Physics of “Solids”

AI understands light, but it doesn’t understand physics.

  • The Tell: Watch clothing and hair.
  • What to look for: Does the collar of their shirt melt into their neck? Does a strand of hair pass through their glasses instead of over them?
  • The Glitch: This is called “clipping.” In the viral “Militia” video circulating yesterday, you can see a rifle strap merge into the man’s shoulder. That was the giveaway that it was fake.

The Verdict

We are entering an era where “Seeing is Believing” is a dangerous mindset. Skepticism is your best firewall. If a video makes you feel an extreme emotion (rage, fear, shock) instantly, pause. That is a feature, not a bug. It was designed to do that.

Don’t share it until you verify it.

