Netflix's Adolescence only scratches the surface of online harms facing young people

03 Apr 2025
Written by Sarah Davies AM, Chief Executive Officer of the Alannah & Madeline Foundation and Dolly’s Dream

Within a week of the new Netflix series Adolescence dropping in mid-March, it was all anyone was talking about. It has become something of a cultural phenomenon, sparking conversations about online misogyny, image-based abuse and cyberbullying. In the UK, where the series was produced, it has broken Netflix records, and Prime Minister Keir Starmer has even voiced his support for a campaign to make it compulsory viewing in schools. Clearly, Adolescence has touched a nerve.

There are multiple themes running through the storyline – all a sobering social commentary on how we are thinking about and treating each other. Violence against women, and attitudes towards them (as in Australia, more than one woman a week is killed in England and Wales by a current or former partner), are both subtly portrayed (through the wife’s stewardship of her husband’s emotional regulation, and the differing approaches of male and female teachers) and violently portrayed (as in the key dramatic storyline). The question of how, as a society, we are nurturing and raising boys through adolescence into manhood screams at us in every scene. As Hunter Johnson, co-founder of The Man Cave, so eloquently puts it: “(Teenage boys) are trying to figure out how to be good men in a world that hasn’t shown them how.”

But the theme the Alannah & Madeline Foundation is most qualified to dissect and comment on is the impact of technology on our children and young people. We work every day with children and young people, and with their families and teachers, right across the country. We hear what they are telling us, we understand what they are experiencing, and we are working to address the risks and harms, as well as building the skills and competencies to take advantage of the strengths and opportunities that tech offers.

The show centres on the Miller family, whose lives are thrown into turmoil when one of their own, 13-year-old schoolboy Jamie, is accused of murdering a girl in his class. Jamie’s parents are initially unable to believe that their son could be involved in something so horrific. But when the investigation unearths incontrovertible evidence, they’re forced to confront the unthinkable.

A big part of what has made Adolescence so unnerving for parents is that the Miller family, and indeed Jamie, are totally relatable. They are presented as being like any other family, and Jamie is your typical teenaged boy – a little quiet, spends most of his time alone in his room online, but basically a “good kid.” There are no obvious warning signs that he has embarked on such a dark path. The implication? This could happen to any family.

One of the show’s writers, Jack Thorne, has called for stronger regulation of children’s use of technology in the UK. He has cited Australia’s ban on social media for children under 16 as an example of the kind of policy he would like to see adopted. “If it was my decision, I would be talking of smartphones like cigarettes and issuing an outright ban on all use by under-16s,” he told The Guardian UK, “but if that isn’t possible the digital age of consent is a fine alternative.”

At the Alannah & Madeline Foundation, we've long advocated for stronger regulations and greater public awareness of the very real risks young people face from harmful online influences. However, we’re not convinced that a ban on technology or social media is the answer. We don’t downplay online harms like cyberbullying and toxic masculinity; they are, quite frankly, alarming, and they frequently lead to tragic outcomes.

But technology also offers children and young people valuable opportunities for connection and learning. By simply denying access to digital spaces, we risk doing them a great disservice. An outright ban may also prove to be unrealistic, as technology is thoroughly entrenched in the fabric of our lives. Children are resourceful, and there is a strong likelihood that they would find ways around any proposed ban, making it far more difficult for us to have honest conversations about their technology use.

The other half of this equation is children and young people themselves; all too often, they are left out of these discussions. If we want to understand our children and their relationship to technology, it is crucially important that we have honest, regular conversations with them, beginning as early as possible and continuing as they grow. This doesn’t mean lecturing or interrogating but actively listening and taking an interest in what they are doing online.

Focusing on a social-media or phone ban also risks obscuring deeper issues with technology and distracts from the more nuanced conversations we should be having. Public discourse, like what we’ve been hearing in the wake of Adolescence, tends to focus on what we call the visible harms: things like social media, AI deepfakes and cyberbullying, which can have overt, harmful impacts.

What we rarely talk about is the underlying system that drives those harms. The internet runs on a business model built to maximise profits, not to safeguard the wellbeing of users. Within this business model, capturing the attention and engagement of users is paramount, regardless of whether the content is harmful or even true.

Algorithms and recommender systems are part of that business model. When a user clicks or lingers on a particular type of content, the algorithm registers this and keeps offering up more of the same. The result is that users get fed an endless stream of content that, for whatever reason, has struck a chord, and they become stuck in an echo-chamber that reinforces a particular worldview. Scrolling a social media feed for hours every day might not be healthy for anyone – especially children – but it is certainly lucrative for those who own the platform.
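The feedback loop described above can be illustrated with a deliberately simplified, hypothetical sketch. Real recommender systems are vastly more complex, but the underlying principle is the same: engagement in, more of the same out. All names and data here are invented for illustration.

```python
from collections import Counter

def recommend(history, catalogue, n=3):
    """Toy engagement-driven recommender.

    Ranks catalogue items by how often the user has previously
    engaged with each item's topic -- the more you click or linger
    on a topic, the more of it you are served.
    """
    # Count past engagements per topic.
    weights = Counter(item["topic"] for item in history)
    # Items matching the user's most-engaged topics rank highest.
    ranked = sorted(catalogue, key=lambda item: -weights[item["topic"]])
    return ranked[:n]

# A user who has lingered on one kind of content four times out of five.
history = [{"topic": "extreme"}] * 4 + [{"topic": "sport"}]
catalogue = [
    {"id": 1, "topic": "sport"},
    {"id": 2, "topic": "extreme"},
    {"id": 3, "topic": "news"},
    {"id": 4, "topic": "extreme"},
]

feed = recommend(history, catalogue)
# The resulting feed is dominated by the topic already engaged with,
# reinforcing the echo chamber rather than diversifying it.
```

Nothing in this loop checks whether the content is healthy or true; it optimises only for what held attention before, which is precisely the design choice the article critiques.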

The system doesn’t have to work this way. Misogyny, misinformation and cyberbullying aren’t inherent to digital technology; the problem is that adequate guardrails haven’t been put in place to stop their proliferation. Despite what Big Tech would have you believe, it is possible to maintain profitability without sacrificing children’s wellbeing, and we can and should demand better. Our vision is that online spaces are safe for children, and that children are allowed to thrive. And this vision is achievable if we have:
  • Safety by Design: implement age-appropriate and safety-by-design requirements for all digital services accessed by children, not just social media.
  • Default Privacy & Safety Settings: all digital products and services must automatically provide the highest level of privacy and safety settings for users under 18 years, by default.
  • Age-appropriate Content: ensure all content is aligned to the developmental stage, cognitive abilities and emotional maturity of the user based on age.

It is government’s responsibility to ensure children’s digital rights are upheld and realised – setting minimum standards based on community expectations and holding tech companies to account for not meeting these standards.

It is tech companies’ responsibility to prevent their services and products from being used in ways that violate children’s rights, and which expose children to harms while using their services and platforms.

And it is up to the rest of us to take responsibility to upskill and educate ourselves and our children on how to navigate tech and the online world safely and confidently; to participate with them in their digital worlds, not just police them.

Yes, the themes in the show are complex and intertwined. No, there are no ‘quick fixes’. But we can avoid repeating past mistakes where children’s rights have been overlooked in social media and tech regulation.

Measures must be put in place to ensure that all technologies function to benefit children and not to harm or exploit them.

Get Advice

For practical advice and resources to help empower the children in your care to be safe online:
  • Visit the DigiTalk online safety hub for parents and carers.
  • Download the free cyber safety Beacon App.
  • Visit eSafety, Australia's independent regulator for online safety.

Dolly's Dream sits within the Alannah & Madeline Foundation. 
 
Dolly’s Dream was created by Tick and Kate Everett following the shattering loss of their 14-year-old daughter, Dolly, to suicide, after ongoing bullying. Tick and Kate’s goal is to prevent other families walking this road. They want to change cultures and behaviours to prevent bullying, by increasing understanding of the impact of bullying, anxiety, depression, and youth suicide and by providing support to parents.
The Alannah & Madeline Foundation shares this goal and has a long and successful track record of creating a safer online and offline world for children.
United by this common purpose, they formed a partnership to educate parents on the problems of bullying, its effects and how to deal with it; and then to empower children and adults to recognise bullying when it occurs and have the confidence and skills to stand up and talk about it.
