
Bursting the Algorithmic Bubble: Teaching Students to Think Critically Online

Let’s imagine this: You’re scrolling through Instagram. Every post feels like it was made just for you: funny reels, familiar faces, even things you were just thinking about. It feels convenient, almost magical.

But what if that convenience came at a cost? A hidden cost called the algorithmic bubble.

What is an Algorithmic Bubble?

The term refers to the invisible feedback loop created when algorithms (especially on social media and streaming platforms) serve you content based on your past interactions and preferences.

At first, it seems helpful. Over time, though, it limits exposure to new and diverse perspectives.



[Diagram: the algorithmic content loop. A user clicks, content is tailored, more clicks follow, and the algorithm keeps learning, shaping the online experience.]

How did we get here? Modern platforms run on machine learning algorithms that depend on one thing: data.

Every click, pause or scroll is a signal to the algorithm. The more you interact, the more it learns about you, until it can predict your choices almost perfectly.

Let’s break it down:

  1. Instagram wants you to keep scrolling.

  2. YouTube wants you to watch the next video.

  3. Amazon wants you to keep buying.

To achieve this, algorithms provide content that feels familiar and engaging, even when it is biased or incomplete.
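
If you want to make this feedback loop concrete for students, here is a minimal Python sketch. It is a toy model, not any real platform's recommender: the topic list, the "favourite topic" behaviour and the simple click-count weighting are all invented for the demo.

    import random
    from collections import Counter

    # Toy feedback loop: the "recommender" simply shows more of what was clicked before.
    TOPICS = ["sports", "music", "politics", "science", "cooking"]

    def recommend(click_history, feed_size=10):
        # Weight each topic by past clicks; the +1 keeps unseen topics in the mix.
        weights = [1 + click_history[topic] for topic in TOPICS]
        return random.choices(TOPICS, weights=weights, k=feed_size)

    def simulate(favourite="politics", days=15):
        clicks = Counter()
        for day in range(1, days + 1):
            feed = recommend(clicks)
            share = feed.count(favourite) / len(feed)
            if day == 1 or day == days:
                print(f"Day {day}: {share:.0%} of the feed is about {favourite}")
            # The simulated student only ever clicks posts about their favourite topic.
            clicks[favourite] += feed.count(favourite)

    simulate()

Even with an "algorithm" this simple, one topic ends up dominating the simulated feed after a handful of rounds. That is the bubble forming.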

                                     

[Illustration: a concerned student inside a digital bubble, surrounded by icons for social media, news, shopping and video platforms, symbolising the filter bubble effect.]

Real-Life Example: Hari’s Political Echo Chamber

Hari, a 17-year-old student, had recently taken an interest in politics. She followed accounts that matched her progressive views. Soon, her feed was full of posts that echoed her beliefs. It felt reassuring, as though everyone agreed with her.


Then, during a class discussion, a friend introduced a completely different viewpoint. Hari realised she had never seen anything like it in her feed. Curious, she began following a wider range of voices. Over time, her social media began to reflect more variety.


She came to understand that her online world had been filtered, and she had not even noticed.

This is how algorithmic bubbles function. They keep us comfortable by reinforcing our existing beliefs, but that comfort can stop us from engaging with different ideas and learning to think critically.


  •  A 2018 MIT study of Twitter found that false news stories were about 70% more likely to be shared than true ones, largely because they felt novel and emotionally charged.


Algorithms exploit this natural bias to maintain engagement.


Students are especially vulnerable to algorithmic bubbles, often without realising it.


Why This Matters in Your Classroom

Students are constantly online. They search for homework help, watch tutorials, follow influencers and scroll through endless threads of content. Most of the time, they trust what they see.


But they rarely ask questions like:


  • Who created this, and why?

  • Is it accurate or simply popular?

  • What might be missing from my feed?


These are the kinds of questions that build critical thinking. The good news is that these skills can be taught, and teaching them does not require deep technical knowledge.

                   

How Algorithmic Bubbles Impact Students


  • Academic Research: When searching online, algorithms tend to surface sources that align with your past interests. A student researching a debate topic might see only articles that support one side, limiting critical thinking.


  • Social Awareness: Social media shapes views on politics, identity, and current events. If your feed only reflects your beliefs, it can falsely feel like everyone thinks the same way you do.


  • Mental Health: Platforms show emotionally engaging content. For students, that might mean endless posts that trigger comparison, stress, or anxiety, especially around grades, appearance, or popularity.


  • Career Exploration: Watch a few videos about medicine, and suddenly your feed is full of only that, crowding out paths you might have loved but never discovered.


It's like revising the same page every day. Familiar? Yes. But you are not learning anything beyond what you already know.


Are Algorithmic Bubbles Always Bad?

No. YouTube suggesting tutorials based on your needs? Very helpful. Amazon reminding you to buy new toothpaste? Very convenient.

The problem arises when the algorithmic bubble becomes the only lens through which you see the world.


So What Can You Do About It?


Simple Strategies to Bring into Your Classroom

Helping students understand algorithmic bubbles doesn’t require fancy tech or complex tools. A few thoughtful activities can go a long way in developing their critical thinking. Here are four that you can easily try:


1. What’s In Your Feed? (Feed Analysis)

Set two phones side by side, ask students to open the YouTube or Instagram home feed on each, and take a closer look. Encourage them to reflect on questions like:


  • What types of videos or posts are showing up?

  • Why might this content be appearing for them specifically?

  • What kind of voices or viewpoints might be missing?


This opens their eyes to how personalisation shapes their online experience.


2. Can You Trust This? (The CRAAP Test)

Teach students a simple method for evaluating online information. The CRAAP Test stands for:


  • Currency – Is the content current or outdated?

  • Relevance – Does it actually relate to what they’re looking for?

  • Authority – Who created it, and are they credible?

  • Accuracy – Can the facts be verified by other reliable sources?

  • Purpose – Is it trying to inform, sell, persuade or entertain?


3. Whose Side Is This On? (Spot the Bias)

Present two articles or videos on the same issue but from different sources. Ask students to:


  • Compare tone, language and imagery.

  • Identify what each source includes or leaves out.

  • Discuss which side seems favoured, and why.


This helps them develop a sense for media bias and learn to look beyond surface impressions.


4. Is This Even True? (Figuring Out Misinformation)


Take a popular claim or headline and ask students to investigate. Have them:

  • Research it using trusted sources.

  • Check facts on websites like Full Fact or Snopes.

  • Present what they found and how they verified it.


This builds confidence in questioning viral content and not taking things at face value.

→ These activities help students go beyond passive scrolling and start engaging actively with what they see online. 


→ They encourage learners to question the sources, intentions and accuracy of digital content.


→  By doing so, students begin to recognise how algorithms influence their perspectives and develop habits of critical thinking.


Moving Forward

Helping students think critically online does not mean avoiding social media. It means helping them understand how it works and how to question it.


As a teacher, you have the power to shape how students engage with the digital world and build their resilience against misinformation.


At KCITE, we believe digital literacy is essential. Our teacher training programmes equip you with the tools to create classrooms where students are curious, thoughtful and informed.


Quick Recap

  • Algorithms influence what students see online

  • This leads to bubbles of limited or biased content

  • Students must learn to question, compare and verify information

  • You can teach these skills through simple, engaging classroom strategies


References:

  • Vosoughi, S., Roy, D., & Aral, S. (2018). The spread of true and false news online. Science, 359(6380), 1146–1151.
