Dolly's Dream acknowledges the Australian Government’s announcement yesterday that it plans to introduce legislation to enforce social media age limits – but warns that this alone won't ensure children's online safety, including when it comes to bullying.
While holding tech companies accountable is essential, age limits alone fail to address the underlying factors that make social media inherently unsafe for children and young people. Dolly's Dream urges the Government and tech platforms to take immediate steps to address the key underlying issues: data harvesting, privacy breaches, harmful recommender systems, and risks of bullying and harassment.
For too long, the burden of protecting children online has unfairly fallen on parents, teachers, and children themselves. Social media platforms, driven by profit, are designed to capture and retain users’ attention and data, often at the expense of privacy and safety, and can become breeding grounds for bullying.
To be effective, age limits require technologies to verify users' ages, which could compromise children's personal information if not carefully regulated. Instead of focusing solely on age restrictions, a broader safety net is needed. This includes enforcing a Children’s Online Privacy Code, mandating age-appropriate design, and reforming the algorithms that profile and target young users with inappropriate content.
Keeping children safe online requires collective action from governments, tech companies, and regulators to create a safer, more responsible digital environment that prioritises children's safety and well-being over commercial interests.
Multi-pronged approach to keep children safe online
- We believe it’s government’s responsibility to ensure children’s digital rights are upheld and realised – setting minimum standards based on community expectations and holding tech companies to account for not meeting these standards.
- It’s tech companies’ responsibility to prevent their services and products from being used in ways that violate children’s rights or expose children to harm while using their services and platforms.
- And it’s up to the rest of us to take responsibility to upskill and educate ourselves and our children on how to navigate tech and the online world safely and confidently; to participate with them in their digital worlds, not just police them.
A broader safety net to address the underlying causal factors must include:
1. Default Privacy & Safety Settings: all digital products and services must provide the highest level of privacy and safety settings for users under 18 years by default.
2. Data: data is the currency of many tech services and products, but the commercial harvesting, collection, scraping, sale, and exchange of children’s data must be banned.
3. Privacy: prohibit behavioural, demographic and biometric tracking and profiling of children, and accordingly ban profiled, commercial advertising to children under 18.
4. Recommender Systems (the algorithm): ban the use of recommender systems – software that suggests products, services, content or contacts to users based on their preferences, behaviour, and similar patterns seen in other users.
5. Reporting Mechanisms: require child-friendly, age-appropriate reporting mechanisms and immediate access to expert help on all platforms.
6. Independent Regulation: end self- and co-regulation by the tech industry and establish a fully independent regulator with the power and resources to enforce compliance.
7. Safety by Design: implement age-appropriate and safety-by-design requirements for all digital services accessed by children.
8. Public Data Access: ensure public access to data and analytics for regulatory, academic, and research purposes.
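To make the fourth point concrete: the recommender systems described above typically work by comparing one user's behaviour with patterns seen in other users, then promoting whatever similar users engaged with. The sketch below is a minimal, hypothetical illustration of that collaborative-filtering idea in Python; all user names and ratings are invented, and real platform systems are far larger and weigh many more engagement signals.

```python
from math import sqrt

# Minimal, illustrative sketch of the collaborative-filtering idea behind
# recommender systems: suggest items liked by users whose behaviour looks
# similar. All names and ratings below are invented for illustration.

def cosine_similarity(a, b):
    """Cosine similarity between two users' item-rating dictionaries."""
    items = set(a) | set(b)
    dot = sum(a.get(i, 0) * b.get(i, 0) for i in items)
    norm_a = sqrt(sum(v * v for v in a.values()))
    norm_b = sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def recommend(target, others, top_n=2):
    """Rank items the target user hasn't seen, weighted by user similarity."""
    scores = {}
    for other in others:
        sim = cosine_similarity(target, other)
        for item, rating in other.items():
            if item not in target:
                scores[item] = scores.get(item, 0.0) + sim * rating
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

# Hypothetical users rating videos from 1 (dislike) to 5 (like).
alice = {"video_a": 5, "video_b": 4}
bob = {"video_a": 5, "video_b": 5, "video_c": 4}  # tastes similar to alice
carol = {"video_a": 1, "video_d": 5}              # tastes dissimilar

print(recommend(alice, [bob, carol]))  # video_c, liked by similar bob, ranks first
```

This toy mechanism is exactly what makes such systems powerful at holding attention: content is pushed to a child not because it is safe or appropriate, but because similar users engaged with it.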
We look forward to reviewing the proposed bill in more detail once it is released and available for public consultation. We will continue to advocate for the rights of children and young people to be upheld online, and for optimal safety standards, including protections against bullying, to be built into all digital spaces and devices.