Australia's Lessons from Global Struggles to Control Kids' Internet Access

Debate continues to rage in Australia over whether children should (or can) be banned from social media. Following politicians' recent promises to ban those under 16 from the platforms, eSafety Commissioner Julie Inman Grant has raised concerns that imposing age restrictions could push children to use social media in secret and limit their access to critical social supports.

Author

  • Lisa M. Given

    Professor of Information Sciences & Director, Social Change Enabling Impact Platform, RMIT University

A recent analysis in the United Kingdom found a social media ban "would solve nothing", citing evidence from an 18-year study across 168 countries that showed "no causal relationship" between internet access and young people's wellbeing.

The Australian federal government has committed to trialling age assurance technology to restrict children's access. For now, it's unclear whether any existing technology could effectively restrict access by age.

Other countries have tried, and mostly failed, to ban children from accessing online content for decades. Australia would be wise to heed the lessons learned from these experiences.

What has the United States tried?

The Children's Online Privacy Protection Act (COPPA) was introduced in the United States in 1998. It continues to influence how children - globally - access information online.

COPPA imposes several requirements on "operators of websites or online services" who gather personal information from children under 13. This includes the need to obtain parental consent.

To comply with this law, many companies (including social media platforms) imposed bans on children under 13 from accessing online services.

However, these bans have been heavily criticised for contributing to age fraud online. They also limit children's rights to access information and to self-expression, as protected under the United Nations Convention on the Rights of the Child.

Another wide-reaching attempt to restrict children's access to "obscene or harmful content over the internet" was introduced in the United States in 2000.

The Children's Internet Protection Act (CIPA) required schools and libraries to control the content children could access online. This was typically achieved using internet filters which blocked searching for particular words.

However, these blunt instruments often blocked useful information. A blocked search for the word "breast" to limit access to pornographic content could also block information on breast cancer, for example.

Over many years, research has shown internet filtering is ineffective at shielding children from bad experiences online.

Unsuccessful age bans

Many other countries have imposed bans on children's access to online content, with varying degrees of success.

South Korea imposed a "shutdown law" in 2011. Designed to address online gaming addiction, it barred those under 16 from accessing gaming sites after midnight.

However, many children used accounts in their parents' names to continue accessing gaming sites. The law also faced legal challenges, with parents concerned about restrictions on their rights to parent and educate their children. The law was abolished in 2021.

In 2015, the European Union introduced legislation that would ban children under 16 from accessing online services (including social media) without parental consent.

The proposed legislation was controversial. There was a significant outcry from technology companies and human rights organisations. They claimed the rules would violate children's rights to expression and access to information.

The law was amended to allow individual countries to set their own age limits, with the United Kingdom opting to keep restrictions only for those under 13. The result was a patchwork of age thresholds across Europe.

In 2023, for example, France enacted a law requiring social media platforms to restrict access for teens under 15 unless authorised by a parent or guardian.

Today, Europe leads the world in imposing significant online protections for children, with huge implications for tech companies.

In 2023, the new Digital Services Act came into force, forbidding platforms like Instagram, Facebook, TikTok and Snapchat from targeting children with personalised advertisements.

Rather than banning children from online services, this legislation focuses on controlling how very large platforms engage with children. It's meant to ensure protections are in place to manage harmful content and algorithmic influences on platform use.

What can Australia learn from these global attempts?

A critical message over the last two decades is that bans are not effective. While technological interventions (like filtering and age assurance technologies) continue to improve, there are many workarounds (such as using others' accounts) that make it impossible to ban children outright.

One effective approach has focused on protecting children's personal data. This has led to long-standing requirements for companies to comply with restrictions. India and Brazil have recently introduced similar data-focused protections for children.

However, for older children, significant restrictions can conflict with UN protections for children's rights. Australia must carefully balance potential conflicts when attempting to limit or ban children's online access.

Even if Australia did impose a ban for children under 16, it would be unlikely to reshape global approaches to such bans.

The US and EU are large markets, with significant influence on the actions of technology companies. As with COPPA's influence on limiting social media access for children under 13 globally, it's likely that American and European policy innovations will continue to play a primary role in shaping global approaches.

Australia should lead by aligning its approach to these international endeavours to bolster appropriate protections for young children. At the same time, we should help parents educate older children about appropriate social media use.

This strikes an appropriate balance between protecting children's rights to access information and express themselves, while ensuring guardrails are in place to do so safely.

The Conversation

Lisa M. Given receives funding from the Australian Research Council. She is a Fellow of the Academy of the Social Sciences in Australia and of the Association for Information Science and Technology.

Courtesy of The Conversation. This material from the originating organization/author(s) might be of the point-in-time nature, and edited for clarity, style and length. Mirage.News does not take institutional positions or sides, and all views, positions, and conclusions expressed herein are solely those of the author(s).