
Why You Can’t Stop Scrolling—And How It’s Warping the Truth

Why can’t you stop scrolling? Your brain’s dopamine system, manipulative algorithms, and engineered lies are conspiring to keep you hooked—and spreading misinformation from COVID conspiracies to election fraud. Learn how to break the cycle and reclaim your attention.
Photo by Marjan Grabowski / Unsplash

Raise your hand if this sounds familiar: You open Twitter/X to fact-check a headline, and suddenly you’re 15 tabs deep into a conspiracy thread about aliens or election fraud. You’re not weak-willed—you’ve just been outgunned by a system that hijacks brain chemistry, weaponizes algorithms, and bankrolls deception.

Misinformation isn’t spreading because people are dumb. It’s spreading because trillion-dollar platforms are designed to favor lies over facts. Let’s break down why—and how to fight back.

Dopamine Traps: How Clickbait Conspiracies Hijack Your Brain

Let’s start with a universal truth: Your brain loves novelty. It’s why you instinctively turn toward a honking car or click on a headline like “YOU WON’T BELIEVE WHAT HAPPENS NEXT!” This isn’t you being gullible; it’s biology. Enter dopamine, a neurotransmitter often (and misleadingly) called the “feel-good chemical.” In reality, dopamine isn’t about pleasure; it’s about anticipation. This is key; keep it in mind as you read through the article. It’s the brain’s “Wait, something important might happen here—pay attention!” signal.

How Scrolling Hijacks This System
Every time you refresh your feed, your brain treats it like a mini-science experiment: Will this swipe bring a funny meme? A shocking update? A juicy piece of gossip? Social media platforms exploit this by delivering unpredictable rewards—a viral clip here, a polarizing hot take there. Neuroscientists call this a “variable reward schedule,” and it’s the same mechanism that keeps gamblers glued to slot machines.
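
If it helps to see that dynamic as code, here’s a minimal, made-up simulation (every probability and decay rate below is invented for illustration; it models the classic “slot machine” intuition, not any real platform): a toy scroller keeps swiping as long as its anticipation stays high, and occasional unpredictable “hits” keep renewing that anticipation.

```python
import random

def session(reward_probability: float, rng: random.Random, max_swipes: int = 200) -> int:
    """Count swipes until the toy scroller quits.

    Anticipation ("urge") is fully renewed by a hit and decays slowly
    during dry streaks -- a crude stand-in for dopamine-driven anticipation.
    """
    urge = 1.0
    swipes = 0
    while swipes < max_swipes and rng.random() < min(urge, 0.95):
        swipes += 1
        if rng.random() < reward_probability:  # an engaging post appears
            urge = 1.0                         # anticipation fully renewed
        else:
            urge *= 0.97                       # urge fades a little on each miss
    return swipes

rng = random.Random(42)
# Occasional, unpredictable rewards (1 in 10 posts is a "hit") sustain long
# sessions, because the *next* swipe might always be the one that pays off.
with_hits = sum(session(0.10, rng) for _ in range(1000)) / 1000
no_hits = sum(session(0.0, rng) for _ in range(1000)) / 1000
print(f"avg swipes with occasional hits: {with_hits:.0f}")
print(f"avg swipes with no hits:         {no_hits:.0f}")
```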

But here’s where misinformation sneaks in. Studies show that content triggering high-arousal emotions (outrage, fear, surprise) releases more dopamine than neutral information. For example:

  • A 2018 MIT study found false news spreads 6x faster than factual content on X/Twitter, partly because it’s often more novel and emotionally charged. I'm sure this doesn't surprise you.
  • Conspiracy theories (“The election was stolen!”) or sensational claims (“Miracle cure ignored by doctors!”) exploit this dopamine-driven curiosity loop. They act like “mind bombs”: flashy, emotionally explosive ideas designed to hijack attention before critical thinking kicks in.

Why Your Brain Struggles to Resist
Dopamine doesn’t just make you want to keep scrolling; it also weakens your impulse to fact-check. When you’re in a dopamine-chasing state (researchers call this “hot cognition”), your brain prioritizes speed over accuracy. You’re more likely to:

  • Share a post because it feels true (even if it’s not).
  • Remember shocking claims longer than boring facts (a phenomenon called salience bias). I know I am guilty of this, and I am sure a lot of you are as well.
  • Trust information that aligns with your existing beliefs (thanks, confirmation bias). This is probably the toughest one for us to guard against.

Here’s the scary part: Bad actors know this. Troll farms, extremist groups, and even clickbait marketers deliberately craft lies that mimic dopamine-triggering patterns. A fake news headline isn’t just a lie; it’s a neurological trap.

The Real-World Cost
This isn’t just about wasting time. When dopamine-driven clicks reward misinformation, algorithms learn to push more of it. The result? During the COVID-19 pandemic, the WHO warned of an “infodemic,” a flood of false cures and conspiracy theories that directly undermined public health efforts. Your brain’s ancient wiring, optimized for spotting lions on the savanna, is now being used to sell digital snake oil.

Now let's take a look at how algorithms turn this biological vulnerability into an endless supply of outrage.


Algorithms—The Invisible Puppeteers Behind Your Feed

You just searched for “vegan brownie recipes.” Harmless, right? But within hours, your feed is flooded with posts about “Big Food’s war on plant-based diets,” a documentary questioning climate science, and a meme claiming Bill Gates wants to ban meat. Wait, how did we get here? (For a genuinely fascinating look inside the world of algorithms, watch “The Social Dilemma,” listed in the references at the end of this article.)

The answer lies in algorithms, the secret rulebooks that decide what you see online. Contrary to popular belief, these algorithms aren’t neutral. They’re built for one job: keep you scrolling at all costs, even if that means feeding you lies.


How Algorithms Fuel Misinformation

  1. Personalization = Polarization
    • Algorithms track everything: your clicks, pauses, shares, even how long your thumb hovers over a post.
    • Over time, they build a “profile” of your biases, fears, and interests. A 2023 Pew Research study found 71% of TikTok users regularly encounter videos they never actively searched for but that align with their inferred views.
    • Example: Watch one video questioning vaccine safety, and the algorithm assumes you’re “interested” in anti-science content. Cue more conspiracies.
  2. Outrage Sells—And Algorithms Know It
    • Platforms like Facebook and X/Twitter prioritize “engagement” (likes, comments, and shares) over truth.
    • A leaked 2021 Facebook internal report admitted “misinformation outperforms truth” because it’s more provocative.
    • False claims about elections, climate change, or health crises generate 4x more clicks than fact-based posts, per a 2022 Nature study (see the toy ranking sketch after this list).
  3. The Rabbit Hole Effect
    • Algorithms use “recommendation chains” to radicalize subtly. Start with a mainstream political post, and soon you’re getting “soft” conspiracy takes (“Why aren’t they telling us the full story?”). A few clicks later, you’re in Flat Earth territory.
    • YouTube’s algorithm, for instance, was found (in a 2020 Mozilla Foundation audit) to recommend climate denial videos 70% more often than factual content after users watched just one skeptical video.
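
To make point 2 concrete, here’s a toy ranking sketch (the post fields, weights, and scoring formula are all invented; no platform publishes its real ranker). The point it illustrates: when the score optimizes predicted engagement and accuracy never enters the formula, the most provocative post rises to the top.

```python
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    accuracy: float  # 0..1, how well-sourced the post is (hypothetical field)
    outrage: float   # 0..1, how emotionally charged it is (hypothetical field)

def engagement_score(post: Post) -> float:
    # Invented weights: predicted clicks/comments/shares track emotional
    # arousal -- and note that accuracy appears nowhere in the formula.
    return 0.2 + 0.8 * post.outrage

feed = [
    Post("County budget passes after routine vote", accuracy=0.95, outrage=0.05),
    Post("New study refines climate projections", accuracy=0.90, outrage=0.15),
    Post("THEY are hiding what's in your food!!", accuracy=0.10, outrage=0.95),
]

# Rank by predicted engagement: the least accurate post lands on top.
for post in sorted(feed, key=engagement_score, reverse=True):
    print(f"{engagement_score(post):.2f}  {post.title}")
```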

The Real-World Cost of “Optimized” Feeds

  • Eroding Trust in Institutions: When algorithms amplify fringe voices, moderate viewpoints get drowned out. A 2023 Stanford study showed users who relied on social media for news were 50% less likely to trust scientists or election results.
  • Profiting From Chaos: Meta (Facebook’s parent company) made $116 billion in ad revenue in 2022—much of it from engagement-driven misinformation. As ex-Facebook employee Frances Haugen testified in 2021, “The platform’s algorithms exploit human weakness.”

The Big Lie Platforms Tell Us

They claim, “We’re just a mirror of society!” But mirrors don’t curate reality. When you see a steady drumbeat of divisive content, it’s not a coincidence—it’s a business model.

Let's take a look at how tech’s “deception playbook” keeps us hooked—and what we can do about it.


The Deception Playbook—How Platforms and Bad Actors Manipulate Reality

You’ve probably seen posts warning, “They don’t want you to know this!” or “Share this before it’s deleted!” It’s a classic manipulation tactic—but it works because it’s paired with a much bigger lie: the myth that social media is a neutral “town square.” In truth, every feature of these platforms is designed to exploit human psychology, and bad actors have become experts at gaming the system.


1. Dark Patterns: The Art of Digital Trickery

Platforms use subtle design tricks to nudge you toward spreading misinformation, often without you realizing it:

  • Deceptive Urgency: “1.2M people have viewed this!” counters or timers (“Trending for 3 hours!”) pressure users to engage before fact-checking.
  • Ambiguous Buttons: A “Share” button is easier to click than a “Fact-Check This” button (which rarely exists).
  • Endless Scroll: Removing natural stopping points keeps users in a dopamine-chasing trance. As former Google ethicist Tristan Harris warned, “It’s like slot machines in a Skinner box.”

2. Weaponized Bots and Troll Farms

  • Fake Consensus: During the 2020 U.S. election, researchers found that 65% of election fraud tweets came from just 1% of accounts—many of them bots. These accounts artificially inflate engagement, making lies feel mainstream. Just take a look at this article from Rolling Stone if you have any doubts: https://www.rollingstone.com/culture/culture-features/elon-musk-twitter-misinformation-timeline-1235076786/
  • Emotional Contagion: A 2022 Stanford study showed that fake accounts (like those run by Russia’s Internet Research Agency) often pose as hyper-partisan Americans, using outrage to provoke real users into spreading their content.

3. The Profit Motive: Why Truth Loses

  • Ad Dollars Over Accuracy: Platforms earn money per click, regardless of content. For example, anti-vaxxers spent $3.5 million on Facebook ads in 2020, spreading debunked claims to millions.
  • The “Liar’s Dividend”: Even when fake content is flagged, the damage is done. Worse, the mere existence of deepfakes (AI-generated videos) and doctored images lets bad actors dismiss real evidence as fake. These fabrications thrive because they generate revenue, and platforms face little accountability.

Case Study: How a Lie Becomes “Truth”

Let’s take a look at how the false “Pizzagate” conspiracy spread in 2016:

  1. Seed: A fake news story claimed a D.C. pizza shop hid a child trafficking ring.
  2. Amplify: Bots and partisan influencers shared it with hashtags like #Pizzagate.
  3. Legitimize: Algorithms pushed it to users interested in “child safety” or “corruption.”
  4. Escalate: A real person, armed with a rifle, stormed the shop.

Tragically, the story didn’t end there. In January 2025, Edgar Welch, the Pizzagate gunman, was killed by police in North Carolina during a traffic stop: https://www.cbsnews.com/news/pizzagate-gunman-fatally-shot-by-north-carolina-police-edgar-welch/


The Gaslighting Effect

When users are bombarded with conflicting claims (“Is the Earth flat? Decide for yourself!”), it creates information paralysis. Over time, people either:

  • Disengage: “I can’t trust anything!”
  • Retreat to Tribalism: “My group’s truth feels safer.”

The result? A 2021 U.S. Surgeon General advisory declared health misinformation “a serious threat to public health,” linking it to vaccine hesitancy and preventable deaths.


The Fallout—When Misinformation’s Poison Goes Viral

Picture this: You’re trying to decide whether to get a new vaccine. You Google it. Half the results say it’s life-saving; the other half claim it’s a plot to “depopulate the Earth.” You ask friends. Their answers split down ideological lines. Exhausted, you give up or default to whatever feels right. This isn’t just confusion. It’s what happens when misinformation erodes society’s ability to agree on basic facts.


1. Psychological Toll: The Anxiety of “Reality Flu”

  • Decision Paralysis: A 2023 APA study found 67% of adults feel overwhelmed by conflicting online claims, leading to “information avoidance” (shutting down).
  • Hypervigilance: Constant exposure to threats (“THEY’RE COMING FOR YOUR KIDS!”) keeps brains in fight-or-flight mode. Chronic stress from doomscrolling is now linked to sleep disorders and depression.
  • Gaslighting Ourselves: When lies and truth blur, people distrust their own judgment. Example: Cancer patients delaying treatment after reading false “natural cure” testimonials.

2. Fractured Reality: No Common Truth, No Compromise

  • Tribal Epistemology: People no longer agree on how to know what’s true. A 2024 Pew study showed 58% of Americans distrust academic experts, preferring “my own research” (i.e., Google deep dives).
  • Polarization as a Product: Algorithms profit by pitting groups against each other. In Brazil, fake claims about election fraud, amplified by Telegram, led to violent riots in 2023—a mirror of the U.S. Capitol attack.
  • Erosion of Democracy: When voters can’t agree on facts, policymaking stalls. Climate change? Gun control? “Debates” become unwinnable culture wars.

3. Life-and-Death Consequences

  • Public Health: The WHO estimates 5.8 million COVID deaths could have been prevented if not for vaccine hesitancy fueled by misinformation.
  • Violence: Conspiracy theories radicalize. The 2022 Buffalo supermarket shooter cited the “Great Replacement” theory—a false claim spread by mainstream algorithms.
  • Economic Collateral: False rumors can crash markets. In 2023, a viral AI-generated image of an “explosion at the Pentagon” briefly wiped out $500 billion in stock value.

The Generational Divide

  • Gen Z’s Cynicism: Young people, raised on algorithmic feeds, are 3x more likely than Boomers to believe “all news is biased” (Columbia University, 2023).
  • Lost Trust: Institutions like media, science, and government aren’t just distrusted—they’re seen as enemies. Rebuilding this trust could take decades.

The Bottom Line: Misinformation isn’t just “fake news.” It’s a toxin that weakens mental health, democracy, and global stability. And without intervention, the dose keeps increasing.


Fighting Back—How to Reclaim Your Attention (and Sanity)

Let’s be honest: You’ll never fully “opt out” of the attention economy. But you can starve the algorithms, rewire your habits, and protect yourself from the worst of misinformation. Here’s how:


1. Personal Armor: Small Shifts, Big Impact

  • The 10-Second Rule: Before sharing, ask: “Would I bet $100 this is true?” This pause disrupts dopamine-driven impulsivity.
  • Curate Your Cues: Turn off ALL non-essential notifications. Research shows just hearing a ping triggers dopamine cravings.
  • Go Grayscale: Switching your phone to black-and-white reduces the “slot machine” appeal of colorful apps (pro tip: works great for night scrolling).
  • Follow the Fact-Checkers: Bookmark sites like Snopes, Reuters Fact Check, or NewsGuard to quickly vet suspicious claims.

2. Tech Tweaks: Outsmart the Algorithms

  • Block the Rabbit Hole: Use tools like:
    • Unhook (YouTube): Removes recommendations and autoplay.
    • News Feed Eradicator (Facebook): Replaces your feed with a calming quote.
    • TikTok Jester (browser extension): Flags accounts tied to misinformation campaigns.
  • Search Like a Pro: Add “site:.gov” or “site:.edu” to Google queries to bypass sketchy sources. Example: “COVID vaccine safety site:.gov” (a tiny helper script follows this list).
  • Diversify Your Diet: Follow accounts you disagree with (within reason). Algorithms hate this—it confuses their profiling.
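
For the “site:” trick, here’s a tiny helper (the function name and example are illustrative; the site: operator and Google’s ?q= URL parameter are standard):

```python
from urllib.parse import quote_plus

def restricted_search_url(query: str, domain_suffix: str) -> str:
    """Build a Google search URL limited to a domain suffix like .gov or .edu."""
    return "https://www.google.com/search?q=" + quote_plus(f"{query} site:{domain_suffix}")

print(restricted_search_url("COVID vaccine safety", ".gov"))
# -> https://www.google.com/search?q=COVID+vaccine+safety+site%3A.gov
```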

3. The Big Fixes: Changing the System

  • Demand Transparency: Support laws like the EU’s Digital Services Act, which forces platforms to reveal how algorithms work.
  • Push for “Friction”: Urge platforms to add speed bumps before sharing unvetted content. Instagram now asks, “Are you sure?” if you try to repost a flagged post—this should be standard.
  • Starve the Trolls: Hit misinformation where it hurts—the wallet. Use ad blockers to deny platforms revenue, or support boycotts of brands that advertise on toxic feeds.

The Hope Spot

A growing “slow media” movement is pushing back:

  • Reddit’s r/NoSurf: 500K+ members supporting digital minimalism.
  • Mastodon and Bluesky: Decentralized platforms where users control algorithms.
  • Schools Teaching “Lie Detection”: Finland’s curriculum now includes media literacy starting as early as kindergarten, cutting susceptibility to propaganda by 65%.

The Bottom Line: You’re not powerless. Every time you close a tab instead of rage-sharing, use an ad blocker, or demand accountability, you’re hacking a system designed to hack you.


Rewriting the Rules of the Attention Economy

Let’s return to where we started: You, your phone, and those stolen hours lost to the scroll. But this time, picture a different ending. You read a controversial post, your thumb hovers, but instead of sharing, you screenshot it and text a friend: “This feels off. Let’s dig deeper.” You close the app, take a walk, and later fact-check the claim. Small act, big ripple.

The battle against misinformation isn’t about perfection. It’s about awareness—recognizing that every click, like, and share is a vote for the world you want to live in. Yes, dopamine will keep pulling you toward outrage. Algorithms will keep peddling lies. And yes, rebuilding trust in institutions will take years. But systemic change is possible when enough people demand it.

The Road Ahead:

  • For Individuals: Embrace “mindful scrolling.” Your attention is currency—spend it wisely.
  • For Tech: Platforms must prioritize ethics over engagement. Imagine a TikTok that rewards critical thinking.
  • For Society: Treat misinformation like a public health crisis. Fund media literacy like we fund math classes.

This isn’t naive optimism. It’s resistance. Every time you question a headline, mute a bot, or lobby for transparency, you’re loosening the grip of dopamine traps and algorithmic puppeteers.

Final Thought: The next time you feel that familiar itch to scroll, remember: You’re not just fighting your brain. You’re fighting a trillion-dollar industry. But history shows us that even the mightiest systems crack under collective pressure. The endless scroll can end—but only if we choose to look up.


References for Further Learning

If you're interested in learning more, here is a list of credible books, studies, documentaries, and tools to dive deeper into the science, ethics, and solutions around dopamine, algorithms, and misinformation:


Books

  1. “Irresistible: The Rise of Addictive Technology and the Business of Keeping Us Hooked” by Adam Alter
    • Explains how tech companies exploit behavioral psychology.
  2. “Weapons of Math Destruction” by Cathy O’Neil
    • Breaks down how algorithms perpetuate bias and inequality.
  3. “The Misinformation Age: How False Beliefs Spread” by Cailin O’Connor and James Owen Weatherall
    • Examines the social dynamics behind fake news.
  4. “Stolen Focus: Why You Can’t Pay Attention” by Johann Hari
    • Discusses the attention crisis and systemic solutions.

Key Studies & Reports

  1. MIT Study on Misinformation Spread (2018)
  2. Facebook’s Role in Amplifying Divisive Content (2021)
    • The “Facebook Files” leaked by whistleblower Frances Haugen.
  3. Stanford Social Media Lab
  4. Nature Study on Engagement vs. Accuracy (2022)

Documentaries & Podcasts

  1. “The Social Dilemma” (Netflix)
    • Tech insiders explain how platforms manipulate users.
  2. “The Great Hack” (Netflix)
    • Explores Cambridge Analytica’s misuse of data.
  3. “Your Undivided Attention” (Podcast)
    • Hosted by the Center for Humane Technology.
  4. “Rabbit Hole” (The New York Times)
    • Investigates how YouTube radicalizes users.

Tools & Initiatives

  1. NewsGuard
    • Browser extension that rates news sites for credibility.
  2. AllSides
    • Shows how different political outlets report the same story.
  3. Mozilla’s RegretsReporter
    • Crowdsources data on harmful YouTube recommendations.
  4. Screen Time Settings
    • Use built-in phone features to set app limits (iOS/Android).

Organizations to Follow

  • First Draft News (combats misinformation)
  • Data & Society (researches tech’s societal impact)
  • Center for Countering Digital Hate
  • Electronic Frontier Foundation (digital rights advocacy)

Note: Always cross-check sources, especially when exploring polarizing topics—even well-meaning experts can have biases! The key is to stay curious, skeptical, and open to updating your views.