Dolly’s Dream backs the eSafety Commissioner’s recent request to the eight most popular social media and messaging platforms to release data on the number of Australian children using their platforms, and on the protections they have in place to safeguard their youngest users from potential harms.
The request comes at a time of increased community discourse and concern around the potential negative impacts of social media and online harms on Australian children. This includes how exposure to harmful content can escalate into bullying and other forms of online abuse.
The Commissioner’s request covers data on Australian children’s use of popular platforms such as Instagram, Snapchat and TikTok, and what age assurance measures – if any – these platforms have in place to enforce their own age limits, such as the requirement to be at least 13 years old to use Instagram.
Access to this data will enable the eSafety Commissioner to hold tech platforms accountable to community expectations and to outline the steps they take to keep their users safe online, including from bullying and other harmful interactions.
Dolly’s Dream CEO Sarah Davies AM said that children and young people continue to be at risk of online harms because the tech platforms they use are not designed to keep them safe.
“Part of the problem is that social media platforms operate on a commercial model that seeks to maximise the handling of individuals' personal data. These platforms are designed to grab and hold their users' attention and to encourage as much interaction as possible. In practice, this leads to platforms that put children at risk because they are low privacy, distracting and highly compelling, feeding users content that is potentially unsafe and making them vulnerable to unsolicited contact.”
Recent research shows that two-thirds of 14–17-year-olds have viewed potentially harmful content online in the past year, including drug use, self-harm and violent images [1].
Even when parental supervision is high, children still face risks [2]. Of Australian teens whose parents take steps to restrict their tech use, 28% have still seen ‘gore’ content online, 27% have seen drug content, 22% have seen eating disorder content, and 14% have seen suicide content.
We know that parents care deeply about their children’s safety online, but they often feel overwhelmed and struggle to manage the risks that arise, including the risk of bullying. Only half of Australian parents feel in control of their children’s data privacy online [3], with most parents of teens saying their children understand tech better than they do [4].
“When it comes to keeping children safe online, we need a multi-pronged approach and a broader safety net – it cannot be the sole responsibility of parents and carers to protect their children from online harms. Tech platforms must take responsibility, and we are calling on them to take immediate steps to address the underlying causal factors that make their platforms inherently unsafe for children,” explained Ms Davies.
“And it must go beyond simple age assurance because research [5] shows that half of children aged 3–12 use at least one social media app or site – this tells us that children are finding ways around current measures to determine age,” she added.
Significant changes are needed to make the digital environment a safe place for children. That’s why Dolly’s Dream is calling for a Children’s Online Privacy Code to restrict what digital platforms are allowed to do with children’s personal information. It is digital platforms’ hunger for people’s data and attention that drives many of the safety risks.
“What we need is age-appropriate safety by design that prioritises the best interests of children over commercial interests. It should be standard practice that all tech, games and apps default to the highest privacy and safety settings for children under 18 years, backed by enforceable regulation that ensures their data privacy and protection from the outset,” Ms Davies said.
This preventative approach would minimise potential harms, giving parents and carers peace of mind and gifting children an internet where they are free to live, learn and play safely.
Multi-pronged approach to keep children safe online
Keeping children and young people safe online takes collective effort – from government, tech companies, and everyone in the community.
- We believe it’s government’s responsibility to ensure children’s digital rights are upheld and realised – setting minimum standards based on community expectations and holding tech companies to account for not meeting these standards.
- It’s tech companies’ responsibility to prevent their services and products from being used in ways that violate children’s rights or expose children to harm on their platforms.
- And it’s up to the rest of us to take responsibility to upskill and educate ourselves and our children on how to navigate tech and the online world safely and confidently; to participate with them in their digital worlds, not just police them.
The most urgent issues that need to be addressed
To address the underlying causal factors, we would mandate:
- Default Privacy & Safety Settings: all digital products and services must default to the highest level of privacy and safety settings for users under 18 years.
- Data: data is the currency for many tech services and products, but we must ban the commercial harvesting, collection, scraping, sale, exchange and theft of children’s data.
- Privacy: prohibit behavioural, demographic and biometric tracking and profiling of children, and prohibit profiled commercial advertising to children under 18.
- Recommender Systems (the algorithm): ban the use of recommender systems – software that suggests products, services, content or contacts to users based on their preferences, behaviour, and similar patterns seen in other users.
- Reporting Mechanisms: require child-friendly, age-appropriate reporting mechanisms and immediate access to expert help on all platforms.
- Independent Regulation: remove self- and co-regulation by the tech industry and establish a fully independent regulator with the power and resources to enforce compliance.
- Safety by Design: implement age-appropriate and safety-by-design requirements for all digital services accessed by children.
- Public Data Access: ensure public access to data and analytics for regulatory, academic, and research purposes.
It is also vital that we continue developing and delivering digital literacy and data privacy education in schools and the broader community, and we must always consult with children and young people.
[1] https://www.esafety.gov.au/research/mind-gap
[2] https://www.esafety.gov.au/research/mind-gap
[4] https://www.accce.gov.au/sites/default/files/2021-02/ACCCE_Research-Report_OCE.pdf
[5] https://www.ofcom.org.uk/sitea...