
The Science of Scientific Mistrust

by Chloe Lipton

Why are people falling for pseudoscience, rejecting experts, and trusting TikTok over decades of research?

In recent years, science has gone from being a trusted authority to a target for conspiracy theories, wellness influencers, and people who genuinely believe Google knows more than decades of research. Scroll through the comment section on any scientific post and you’ll see the classics: “You can’t trust the science,” “Scientists are always changing their minds,” or my personal favourite, “Do your own research.” (Which usually means watching a six-minute video by a man named Greg who once studied crystals… and not for their mineral properties.)

At first glance, it’s tempting to dismiss it all as harmless or just fringe nonsense. But over the last decade, that scepticism has started to spread. And not just quietly. It’s shouting through megaphones, selling supplements, and telling millions of followers not to trust doctors or scientists because they’re “part of the system.”

But here’s the thing: a lot of this mistrust isn’t coming from stupidity or malice. It’s coming from a deep — and, in a way, understandable — misunderstanding of what science actually is.

So let’s talk about it. Let’s talk about why science feels exclusive, how that’s fuelling a boom in pseudoscience and conspiracies, and what we can do to fix it.

“You can’t trust the science – they’re always changing their minds!”

Let’s start with this one, because it’s probably the most common refrain I’ve heard over the past few years. And I get it. When official advice suddenly flips or headlines scream that something we thought was good is now bad (or vice versa), it can feel like scientists don’t know what they’re doing.

But that’s actually the point.

Science isn’t a static list of facts carved into stone tablets. It’s a method, a process. It’s a glorious, occasionally chaotic, always-self-correcting way of inching closer to the truth. And yes, that means sometimes scientists are “wrong” — or more accurately, less right than they later become.

During the early days of COVID, scientists told people masks weren’t necessary for the general public. Then they changed their tune. Cue accusations of flip-flopping and incompetence. But the reality? New data came in and our understanding evolved. That’s not failure. That’s science working exactly as it should.

When people say, “See? They were wrong!” they’re imagining science as a kind of all-knowing oracle. But it’s not. It’s more like a highly trained detective, constantly reviewing the clues and updating the theory. Being open to new evidence isn’t a weakness — it’s the strength of the whole system. Scientific theories are only ever as good as the information we have at the time, and good science adapts when that information changes.

“The science” isn’t a person. Or a club. Or a conspiracy.

One of the problems we’re facing is that science still feels exclusive to a lot of people — like a private members’ club for the clever or elite. Even though many of the traditional barriers are coming down, the perception of exclusivity hasn’t caught up.

And that perception matters.

If you were put off science in school, or made to feel like it wasn’t for “people like you,” it’s not surprising if you grow up resenting it a bit. Or tuning it out. Or feeling like scientists are always talking down to you, speaking in jargon, or hiding behind big words to keep the gates shut.

This is where pseudoscience creeps in. Because while science might occasionally feel cold or confusing, pseudoscience is often warm, simple, and easy to understand. It offers clear answers. It uses just enough “sciencey” language to sound credible. And it validates your instincts rather than challenging them.

Even more dangerously, pseudoscience often feels inclusive. It says, “You don’t need a fancy degree. You already know the truth. Trust your gut.”

See the difference?

Why pseudoscience feels good (and science sometimes doesn’t)

Let’s talk psychology for a second.

Pseudoscience offers certainty, control, and hope — especially in situations where people feel powerless. If you’ve been failed by the healthcare system, if you’ve watched a loved one suffer despite “following the science,” if you’re scared and confused… a simple, comforting answer is incredibly appealing.

Now imagine someone online — friendly, attractive, persuasive — tells you that Big Pharma is lying, but they’ve got the truth. It’s natural. It’s ancient. It’s holistic. And best of all, it works because they tried it and they’re fine now.

Even if it’s nonsense, it’s compelling nonsense.

One of the most infamous examples is Belle Gibson — a wellness influencer who built an empire claiming she cured her brain cancer through diet and detoxes. Except… she never had cancer. At all. It was all a lie. But people believed her — some abandoned their medical treatments to follow her advice — because she made them feel seen, empowered, and hopeful.

It’s deeply human to want certainty, especially when the world feels uncertain — but learning to live with complexity and nuance is part of what makes science, and people, resilient. Pseudoscience may offer simplicity, but it often does so by flattening the truth. Science, frustrating as it may be at times, asks us to sit with the grey areas — and that’s not weakness. That’s honesty.

When “doing your own research” goes wrong

I’ll be honest — I’m all for curiosity. Google away. Ask questions. Get weird about Wikipedia rabbit holes.

But there’s a difference between being curious and thinking you’re now an expert after three YouTube videos and a Reddit thread.

One of the more worrying findings from recent surveys is that people with the strongest anti-science attitudes often believe they understand science better than everyone else — even when they fail basic science literacy tests.

It’s a kind of overconfidence that feels empowering. But it’s also dangerous. Because it leads people to reject expert consensus in favour of fringe ideas that confirm what they already want to believe.

And once someone’s in that space — watching content that reinforces the same message over and over — it becomes very hard to reach them. Social media algorithms don’t show you the other side. They show you more of what you already think. It creates a bubble, and inside that bubble, misinformation feels like truth.
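
To make that feedback loop concrete, here is a deliberately over-simple toy simulation in Python. It is not any platform’s real recommendation system (those are proprietary and far more complex); it just assumes a made-up recommender that boosts whatever the user engaged with last time, which is enough to show how quickly a feed can narrow:

```python
import random

# Toy "feed": each post has a stance, and the user only engages with posts
# that match what they already believe (a crude stand-in for confirmation bias).
STANCES = ["pro-science", "anti-science", "neutral"]

def simulate_feed(user_belief: str, rounds: int = 20, seed: int = 0) -> list[str]:
    rng = random.Random(seed)
    # The recommender starts with an even mix of stances...
    weights = {stance: 1.0 for stance in STANCES}
    shown = []
    for _ in range(rounds):
        # ...and picks each post in proportion to past engagement.
        pick = rng.choices(STANCES, weights=[weights[s] for s in STANCES])[0]
        shown.append(pick)
        # Engagement (only with agreeable posts) makes similar posts more likely next time.
        if pick == user_belief:
            weights[pick] += 1.0
    return shown

print(simulate_feed("anti-science"))
# The early feed is mixed; after a few rounds of "engagement", one stance starts
# to crowd out the others, even though the code never checks which stance is true.
```

Even in this cartoon version, nothing in the loop rewards accuracy, only agreement. That is the bubble.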

We’ve never had more information at our fingertips — but without the tools to sift the good from the garbage, people end up overwhelmed and vulnerable to the loudest voice, not the most accurate one. That’s why media literacy matters. We need to teach people how to evaluate claims, spot manipulation, and identify trustworthy sources — especially when someone online is saying what they already want to hear.

Confirmation bias plays a big role here — we’re all wired to notice evidence that supports our beliefs and ignore what doesn’t. That’s not a flaw in character, it’s just psychology. But recognising that bias is the first step to thinking more critically — and resisting the pull of misinformation dressed up in a lab coat.

So what do we do?

We don’t fix this with more shouting or more facts. We fix it with better conversations, more inclusive communication, and an honest, transparent approach to science that welcomes people in instead of making them feel stupid for not already knowing the answer.

That means:

  • Being clear about what we know, what we don’t, and what we’re still figuring out.
  • Admitting when advice changes — and explaining why.
  • Valuing curiosity over certainty.
  • Teaching science not just as a list of facts, but as a way of thinking critically and compassionately about the world.
  • Listening more, not just “talking louder.”
  • Using plain language. Jargon is the enemy of trust.
  • Making science feel like something people can join, not something they were left out of.

Because the more we help people feel like science is for them, the less likely they are to fall for the people trying to sell them snake oil.

Final thoughts

Science isn’t perfect. It can be slow, messy, and sometimes frustrating to watch — especially when the advice changes or the answers feel uncertain. But that’s not failure. That’s how progress works.

What makes science powerful isn’t that it’s always right, but that it’s willing to be wrong — and to learn from it.

That’s not something to fear or dismiss. It’s something to admire.

So the next time someone says “scientists don’t know what they’re talking about,” maybe ask — is that because they changed their minds? Updated their guidance? Revised their conclusions?

Because that’s not a reason to lose trust.

That’s exactly why they’ve earned it.

Sources and Further Reading

  • Pew Research Center (2024) – “Public Trust in Scientists and Views on Their Role in Policymaking”
  • Cologna et al., Nature Human Behaviour (2025) – “Trust in scientists and their role in society across 68 countries”
  • University of Bath (2025) – “New global research reveals strong public trust in science”
  • The Guardian (2025) – “Belle Gibson really pretended to have cancer…”
  • Gray et al., PLOS Biology (2023) – “People with more extreme attitudes towards science have high self-confidence in their understanding of science, even if it is not justified”
  • Wellcome Global Monitor (2018 & 2020)

An extended version of this article appears on Chloe's Substack.

About the Author

Chloe Lipton is an engineer, STEM communicator, and passionate advocate for diversity in science and tech. She creates short-form videos across TikTok, YouTube, and Instagram, where she breaks down complex topics and sparks curiosity with bite-sized, jargon-free explanations. Chloe also writes Everyday STEMinist on Substack, exploring how science works, why it matters, and who it leaves out — with a mission to make STEM more accessible, inclusive, and engaging for everyone.

