Expect the Lie
Originally a 2–3 min video — also on LinkedIn / TikTok / YouTube · @allemaar
Fifty-eight percent of people surveyed across forty-eight markets say it is getting harder to tell what is true from what is false online. That is from the Reuters Institute's 2025 Digital News Report. Nearly a hundred thousand people, across six continents. More than half are walking into the feed unsure whether the ground is solid.
Think about what that actually means. You open an article. You see a post. You watch a clip. And before you've read the first sentence, some part of you has already decided it's probably fake. Not wrong. Not biased. Fake. That's a new default. Disbelief as the starting position, not the conclusion.
Start with text, because text got here first. An article that looks like it came from a real desk. A comment under it that sounds like a real person. A thread of a hundred voices, all pitched to sound like someone you'd know. Some are. Some aren't. The feed does not ask what something is. It asks whether it moves. Real testimony, synthetic outrage, edited footage, bot replies, copied journalism, generated slop. Once they enter the stream, they compete under the same rules.
And it isn't just text anymore. Video is the same now. A politician appears to say something they never said. A protest clip loses its date, its place, its origin. A voice on a phone call sounds exactly like someone you trust, asking for money. This is not an American election problem. It is a feed problem. The tools got cheap. The quality got good. The volume went vertical.
Everyone is blaming the fakers. That's a content frame. Somebody bad made a bad thing, so we should punish the bad somebody. Fine. But it doesn't fix anything. Because the deeper problem isn't that bad people make fakes. The deeper problem is that nothing in the media itself shows you where it came from. An article has no seam. A clip has no grain. Most voices carry no durable proof of origin. Once something is in the feed, the feed treats real and synthetic exactly the same. That's not a content problem. That's a format problem. The thing we are looking at does not carry its own history.
And here's where it bends. When disbelief becomes the default, real footage gets dismissed as fake too. A real video of a real event. A real recording of a real voice. Someone will say it's AI. And enough people will agree. Not because the evidence is bad, but because disbelief is easier than belief now. The liar wins by default. The truth doesn't need to be refuted. It just needs to be doubted.
This isn't going to get quieter. It's going to get louder. More articles. More posts. More clips. More voices. The volume is the whole strategy. You can't out-read it. You can't out-watch it. Your gut was calibrated on a world that doesn't exist anymore.
But there are signals, if you know where to look. Not perfect signals. Not magic detectors. Patterns. Origins. Chains of custody. The future of trust will not come from believing harder. It will come from media that can prove where it has been.
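What a "chain of custody" for media could look like can be sketched in a few lines. This is illustrative only: real provenance standards such as C2PA use public-key signatures and manifests embedded in the file itself, while this sketch stands in a shared-key HMAC for the signature and invented names (`sign_step`, `verify_step`) for the operations. The idea is the same: each step records who did what to which exact bytes, linked to the step before it, so altered content no longer verifies.

```python
# A minimal sketch of media provenance as a chain of custody.
# Assumption: HMAC with a demo key stands in for a real creator's
# public-key signature; function names here are illustrative.
import hashlib
import hmac
import json

SIGNING_KEY = b"demo-key"  # stand-in for a creator's private key


def sign_step(media_bytes, actor, action, parent_sig=None):
    """Record one step in the chain: who did what, to which bytes,
    linked to the previous step's signature."""
    record = {
        "content_hash": hashlib.sha256(media_bytes).hexdigest(),
        "actor": actor,
        "action": action,
        "parent": parent_sig,
    }
    payload = json.dumps(record, sort_keys=True).encode()
    record["sig"] = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return record


def verify_step(media_bytes, record):
    """Check that the bytes match the record and the signature is intact."""
    if hashlib.sha256(media_bytes).hexdigest() != record["content_hash"]:
        return False
    unsigned = {k: v for k, v in record.items() if k != "sig"}
    payload = json.dumps(unsigned, sort_keys=True).encode()
    expected = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, record["sig"])


clip = b"original footage"
capture = sign_step(clip, actor="camera", action="capture")
edited = clip + b" [trimmed]"
edit = sign_step(edited, actor="editor", action="trim",
                 parent_sig=capture["sig"])

print(verify_step(edited, edit))       # True: bytes match the edit record
print(verify_step(b"tampered", edit))  # False: altered bytes fail
```

The point of the `parent` link is that each record inherits the history before it: follow the chain back and you reach the capture step, or you hit a break, and the break itself is the signal.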