
A confronting new Netflix drama, Adolescence, has reignited national conversation about how social media is influencing young people—and not always for the better.
The series follows a 13-year-old boy accused of murdering a female classmate, raising unsettling questions about online culture, masculinity, and the quiet power of social media algorithms. While fictional, the themes feel uncomfortably close to home for many parents, carers and educators.
At the centre of the concern is the so-called “algorithm effect”—the invisible systems that decide what content appears on our screens.
Algorithms are sets of instructions used by digital platforms to sort, rank and recommend content. They analyse enormous amounts of data—including what users like, watch, click on, or linger over—to personalise feeds and suggestions. Their primary goal is to keep people engaged and scrolling.
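For readers curious about the mechanics, the sketch below shows the basic idea of engagement-based ranking in simplified form. The scoring formula, weights and post fields are invented for illustration; no real platform publishes or uses this exact logic.

```python
# A minimal sketch of engagement-based ranking, the core idea behind
# the "algorithm effect". All numbers and field names are hypothetical.

posts = [
    {"title": "Study tips", "watch_time": 30, "likes": 4},
    {"title": "Extreme fitness challenge", "watch_time": 95, "likes": 40},
    {"title": "Local sports recap", "watch_time": 20, "likes": 2},
]

def engagement_score(post):
    # Engagement signals (time spent watching, likes) drive the ranking.
    # Emotionally charged material tends to earn more of both, so it
    # rises to the top without ever being explicitly favoured.
    return post["watch_time"] + 2 * post["likes"]

# The feed simply shows the highest-scoring content first.
for post in sorted(posts, key=engagement_score, reverse=True):
    print(post["title"])
```

Nothing in the ranking rule mentions the content itself; it only rewards whatever holds attention, which is how sensational material can dominate a feed by default.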
While not inherently harmful, this focus on engagement can come at a cost. Algorithms can amplify sensational, extreme or emotionally charged material, nudging young people towards content that promotes unrealistic beauty standards, violence, misogyny or hate speech.
Recommender systems power most major platforms, including TikTok’s “For You” page, Instagram Reels, YouTube suggestions, Netflix recommendations and Spotify playlists. Once a young person shows interest in a topic—fitness, gaming or self-improvement—the system often pushes more intense or polarising versions of that content.
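The escalation described above can be pictured as a feedback loop: the system offers slightly more intense versions of what a user just engaged with, and the user's profile drifts toward it. The toy simulation below illustrates that dynamic only; the "intensity" scale, drift rule and numbers are invented assumptions, not any platform's actual recommender.

```python
# A toy simulation of the recommendation feedback loop: each round, the
# system offers content at or slightly above the user's current level,
# and the profile drifts toward what was watched. Purely illustrative.

import random

random.seed(1)
interest = 0.2  # 0.0 = mild everyday content, 1.0 = extreme content

for step in range(1, 9):
    # Candidates sit at or just above the current level, because more
    # intense content tends to hold attention longer.
    candidates = [min(1.0, interest + random.uniform(0.0, 0.15)) for _ in range(5)]
    watched = max(candidates)          # engagement-optimised pick "wins"
    interest = 0.7 * interest + 0.3 * watched  # profile drifts upward
    print(f"step {step}: watched intensity {watched:.2f}, profile now {interest:.2f}")
```

Even with small steps, the profile only ever moves in one direction, which is the pattern parents describe when a mild interest turns into a feed full of extreme material.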
For boys, this can mean exposure to violent pornography, extreme misogyny or rigid ideals of masculinity such as “looksmaxxing”. For girls, innocent searches about health or fitness can spiral into harmful beauty ideals, disordered eating content or material linked to self-harm.
The impact builds over time. Infinite scrolling and autoplay can encourage dependency, making it harder for adolescents to disconnect. Echo chambers form, reinforcing narrow worldviews and distorting expectations around relationships, success and self-worth.
Understanding how algorithms work is a key step in protecting young people.
Parents, carers and educators are encouraged to talk openly with children about how their feeds are shaped and why certain content appears. Setting boundaries around screen time, encouraging regular breaks, and diversifying online experiences beyond social media can all help.
Practical steps include using platform settings to manage privacy, applying content filters, limiting time spent online, and reporting harmful material to platforms or to eSafety.gov.au.
Australia’s eSafety Commissioner has also made it clear that responsibility should not fall solely on families. Online platforms are expected to meet the country’s Basic Online Safety Expectations, taking reasonable steps to ensure recommender systems do not promote illegal or harmful content.
Technology isn’t going away. But with awareness, conversation and shared responsibility, Burdekin families can help young people navigate the digital world with greater confidence—and fewer unseen influences shaping who they become.