Why Your Brain Believes Lies (Even When You Know Better)

Let’s set the scene: Imagine a foreign government wants to throw another country’s election into chaos. They don’t send spies or hackers—they just flood social media with fake news stories, Photoshopped images, and conspiracy theories. These aren’t random lies, though. They’re targeted to hit the exact pressure points of society—like pitting political groups against each other or stoking fear about hot-button issues. This isn’t just “oops, someone shared the wrong meme.” It’s disinformation—a deliberate, military-grade weapon to mess with people’s heads.
But here’s the weird part: Even when the lies are obvious (“No, Karen, the moon landing wasn’t faked”), they still work. Why?
Turns out, our brains are kinda like old software with some glitchy code. We’ve got these mental shortcuts—called cognitive biases—that help us make quick decisions (like trusting a friend’s restaurant recommendation). But these shortcuts can be hacked by bad actors to slip fake ideas into our heads.
Let’s dig into why this happens—and how to armor-plate your brain against it.
Your Brain’s “Glitches” (And How Trolls Exploit Them)
Our brains didn’t evolve for TikTok and 24/7 news cycles. Back in caveman days, mental shortcuts kept us alive. If your buddy yelled, “LION! RUN!”, you didn’t stop to fact-check—you ran. Fast-forward to today, and those same shortcuts make us easy prey for disinformation.
Here’s the problem:
- We’re lazy thinkers (no offense). We prefer ideas that fit what we already believe.
- Drama sticks in our brains way more than boring facts.
- We trust people who seem like “our tribe”—even if they’re secretly Russian bots.
- We think we’re smarter than we are (thanks, Dunning-Kruger effect!).
Disinformation pros know this. They’re like psychological pickpockets, using our brains’ quirks against us. Let’s break down their sneaky tactics.
Real-World Mind Games: How Biases Get Weaponized
1. Confirmation Bias: “See? I Told You So!”
How it works: We love info that confirms what we already think. Disinfo campaigns feed this addiction by creating echo chambers.
Example: During the 2016 U.S. election, Russian trolls ran thousands of fake social media accounts. They’d blast conservatives with “Hillary’s running a child trafficking ring!” and liberals with “Trump’s a Russian puppet!” Both sides ate it up because it felt true—even with zero proof.
Why it works: It’s like a dopamine hit for your brain. The more you see “your side” winning, the less you question the source.
2. Availability Heuristic: “OMG, This Is EVERYWHERE!”
How it works: We judge how likely something is based on how easily we can remember examples. Enter: viral lies.
Example: Remember those “COVID vaccines are killing people!” hoaxes? They used graphic (fake) stories and emotional language to stick in your mind. Even if you knew vaccines were safe, that one scary meme about a “healthy teen dying” made you hesitate.
Why it works: Fear hijacks your brain. If you see 10 posts about vaccine deaths (even fake ones), you’ll think it’s a bigger risk than it is.
3. Bandwagon Effect: “Everyone’s Doing It!”
How it works: We follow the crowd—even if the crowd is made of bots.
Example: In France’s 2017 election, Russian bots retweeted pro-Le Pen content thousands of times a day. Undecided voters saw her trending and thought, “Wow, she’s popular—maybe I should vote for her too.”
Why it works: Social proof is powerful. If something looks popular, we assume it’s legit.
4. The Dunning-Kruger Effect: “I Googled It, So I’m an Expert!”
How it works: The less we know about a topic, the more we overestimate our expertise.
Example: During the pandemic, armchair “experts” spread wild theories about 5G towers causing COVID. They’d cite a YouTube video or a shady blog, convinced they’d “done the research.”
Why it works: Confidence is contagious. If someone sounds sure of themselves (even if they’re clueless), we’re more likely to believe them.
The Secret Sauce: How Social Media Fuels the Fire
Disinfo campaigns don’t just rely on brain hacks—they’re turbocharged by social media algorithms. Here’s the dirty secret:
- Algorithms love drama. Platforms like Facebook and X (Twitter) prioritize posts that get strong reactions (angry comments, shares). Guess what’s super shareable? Outrageous lies. (There’s a simplified sketch of this ranking logic right after this list.)
- Echo chambers are profitable. The more time you spend arguing online, the more ads you see. Social media companies have little incentive to fix this.
- Bots are everywhere. Studies suggest up to 15% of Twitter accounts are bots. They amplify divisive content, making fringe ideas seem mainstream.
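To make the “algorithms love drama” point concrete, here’s a minimal, purely illustrative Python sketch of engagement-weighted ranking. This is not any platform’s real code: the posts, scores, and weights are invented for the example. The point is just that if a ranker scores posts by predicted reactions, the most outrage-inducing post floats to the top whether or not it’s true.

```python
# Illustrative only: a toy engagement-based ranker, NOT any platform's real algorithm.
# The posts, scores, and weights below are invented for demonstration.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_shares: float    # how likely people are to share it
    predicted_comments: float  # how likely it is to start an argument
    predicted_anger: float     # how strong an emotional reaction it provokes
    is_true: bool              # note: the ranker never looks at this field

def engagement_score(post: Post) -> float:
    # Reward whatever keeps people reacting; truth isn't part of the formula.
    return 1.0 * post.predicted_shares + 2.0 * post.predicted_comments + 3.0 * post.predicted_anger

feed = [
    Post("Local library extends weekend hours", 0.2, 0.1, 0.0, True),
    Post("City council passes routine budget", 0.1, 0.2, 0.1, True),
    Post("SHOCKING: 'They' are hiding THIS from you!!!", 0.9, 0.8, 0.9, False),
]

# Sort the feed the way an engagement-maximizing ranker would.
for post in sorted(feed, key=engagement_score, reverse=True):
    print(f"{engagement_score(post):.1f}  {post.text}")
```

Run it and the fabricated outrage post tops the feed every time. That’s the whole problem in three lines of arithmetic: the formula measures reactions, not reality.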
How to Fight Back: Train Your Brain (Like a Jedi)
You’re not doomed! Here’s how to build your mental armor:
1. Become a Fact-Checking Ninja
- Pause before sharing. Ask, “Would I bet $100 this is true?” If not, don’t share it.
- Use fact-checking sites: Snopes, FactCheck.org, and Reuters Fact Check are your friends.
- Reverse-image search: Suspect a photo’s fake? Drag it into Google Images—you’ll often find it’s from a 2012 meme.
2. Break Your Echo Chamber
- Follow people you disagree with. If you’re liberal, follow a conservative thinker (and vice versa). You don’t have to agree—just see how the other side thinks.
- Ditch algorithm-driven feeds. Try apps like Flipboard or RSS feeds to curate your own news.
3. Spot Emotional Manipulation
- Watch for loaded language. Phrases like “WAKE UP SHEEPLE!” or “This will DESTROY America” are red flags.
- Check your pulse. If a post makes you furious or terrified, take a breath. Disinfo thrives on knee-jerk reactions.
4. Call Out Fake Stuff (Without Being a Jerk)
- Reply with receipts. If your uncle shares a conspiracy theory, respond with a link to a fact-check. Keep it chill: “Hey, I saw this was debunked here. Thought you’d want to know!”
- Report harmful content. Most platforms let you flag fake news or hate speech. For now anyway.
5. Boost Your Media Literacy
- Learn the tricks. Take a free online course (like Crash Course’s Navigating Digital Information on YouTube).
- Play “Spot the Bot.” Fake accounts often have no profile pics, weird usernames (@TruthWarrior3791), and only post divisive content. (A toy checklist version of this game is right below.)
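If you want to turn “Spot the Bot” into an actual checklist, here’s a minimal sketch of the kind of red-flag heuristics described above: no profile photo, an auto-generated-looking username, an inhuman posting rate, and a feed that’s nothing but divisive content. The example account, thresholds, and field names are all made up for illustration; real bot detection uses far more signals than this.

```python
# Illustrative only: a toy "spot the bot" checklist. The account data and
# thresholds below are invented; real bot detection uses far more signals.
import re

def bot_warning_signs(account: dict) -> list[str]:
    signs = []
    if not account.get("has_profile_photo", False):
        signs.append("no profile photo")
    # Usernames like @TruthWarrior3791: a word mash followed by a long number.
    if re.search(r"\d{4,}$", account.get("username", "")):
        signs.append("auto-generated-looking username")
    if account.get("posts_per_day", 0) > 50:
        signs.append("posts at an inhuman rate")
    if account.get("share_of_divisive_posts", 0.0) > 0.9:
        signs.append("posts almost nothing but divisive content")
    return signs

suspect = {
    "username": "TruthWarrior3791",
    "has_profile_photo": False,
    "posts_per_day": 120,
    "share_of_divisive_posts": 0.95,
}

signs = bot_warning_signs(suspect)
print(f"@{suspect['username']}: {len(signs)} warning signs -> {', '.join(signs)}")
```

No single sign proves anything (plenty of real people skip profile photos), but when an account trips several of these at once, treat whatever it’s amplifying with extra skepticism.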
This Isn’t Just Your Problem—It’s Everyone’s
Fighting disinfo isn’t just about saving yourself. It’s about protecting your family group chats, your community, and democracy itself. Here’s how we can tackle this together:
- Push for transparency. Demand social media companies reveal who’s behind political ads and how algorithms work.
- Support quality journalism. Subscribe to local newspapers or trustworthy outlets. And if a “news” site has no ads and no paywall, ask who’s paying the bills, because somebody is.
- Teach the next generation. Schools should teach media literacy like they teach math.
The Bottom Line
Disinformation works because it’s designed to hack our brains. But here’s the good news: Awareness is kryptonite to manipulation. By staying curious, skeptical, and open to changing your mind, you can outsmart the trolls.
Next time you see a viral post, ask yourself, “Is this true, or does it just feel true?” Your brain—and the rest of us—will thank you.
Stay sharp, stay kind, and don’t let the bots win. 💪
TL;DR: Disinfo preys on brain glitches we all have. Fight back by fact-checking, diversifying your info diet, and staying calm when content tries to rage-bait you. And maybe hide your aunt’s Facebook share button.