eSafety Budget Estimates Opening Statement

Julie Inman Grant

eSafety Commissioner

The internet can be a wonderful but sometimes dangerous place. A place where our children can create, connect and explore but also where they may be targeted by predators or bullies and exposed to horrendous abuse or graphic violence through social media.

This includes gratuitous and high impact footage of people being blown to pieces in wars or terror attacks, shootings and stabbings, brutal fights designed to degrade and multiple forms of sexual violence.

Study after study has shown that exposure to violent content makes children feel unsafe, disturbed, frightened, saddened and shocked. They are seeing this content in primary school and feel guilty for watching. Some struggle to forget what they've seen for many years, while others see the violence becoming an increasingly normalised part of their daily online experiences.

Medical and mental health professionals indicate this leads to children becoming socially withdrawn and desensitised to violent content and, in some cases, adopting violent behaviours of their own.

Both research and experience demonstrate that vulnerable children - including those from disadvantaged backgrounds, those with pre-existing mental health issues, trauma victims, and neurodivergent children - are most at risk from this online manipulation.

Exposure is negatively impacting how our children view the world and their expectations for the future.

In the past six weeks alone - from Sydney to Perth - we have seen how this online brutality is spilling over into real-world violence and harm.

eSafety's purpose is to help safeguard Australians from these kinds of harms and promote safer, more positive online experiences. Our legislated role is to balance the needs of communities and industry through coordination, education and regulation. We help Australians maximise the significant benefits of online engagement, while minimising the risks.

Numbers across all our reporting schemes have surged as Australians increasingly look to us for exactly this kind of help. We are listening to their concerns, and providing the protections they want and need.

In the year to May, complaints to us about cyberbullying of children were 311 per cent higher than the same period four years ago. Complaints about image-based abuse were 242 per cent higher.

Reports of illegal and restricted content were 111 per cent higher - 33,000 URLs in total were investigated by eSafety, the vast majority concerning child sexual abuse material. As with violent and extremist content, every instance of child sexual abuse material was hosted overseas.

It's important to remember that behind each report is a real human story: a person who needs help and who may be in intense pain - very often a child.

That is why we measure our success through the lives we positively impact, not in the number of reports and notices we issue.

There is no question this is a difficult, complex and contestable space. But it is important to clarify that we are not arbiters of speech online, nor are we proactively monitoring the internet. Parliament empowered eSafety to investigate and remediate online harms reported to us by distressed Australians.

And we have a high level of successful remediation in cases of cyberbullying, not only removing the content designed to ostracise or humiliate the child but also working directly with parents and schools. As part of this wrap-around support, we recorded 3,700 click-throughs to Kids Helpline last year.

In a very real sense, some parents have told us that, by providing relief from serious and targeted online abuse, we have literally saved their children's lives when they felt hopeless or had nowhere else to turn.

Compassionate citizen service is a key aim, but we are also using our systems and process powers to ensure greater degrees of transparency and accountability from industry.

Six of the eight industry codes around illegal content are now in place, setting an important and world-leading precedent. Soon, mandatory standards will be tabled before Parliament.

We first published regulatory guidance on the Basic Online Safety Expectations, or BOSE, in July 2022. Today, the Minister has affirmed and expanded the BOSE determination, strengthening these expectations significantly at a time when tech companies are becoming more opaque.

We have now issued 19 transparency notices, covering 30 major online services, all domiciled overseas. Legal compulsion to reveal what these companies are - or are not - doing to keep Australians safer online is critical to holding them transparent and accountable, and to driving meaningful change.

Rigorous, evidence-based research and resources are also critical to eSafety's mission. These help Aussie kids explore the internet without fear, empower parents to better guide their children, and arm educators with the digital literacy tools they need.

In the past year, more than 2.5 million unique visitors engaged with our online resources, a 57 per cent increase on the previous 12 months. We've reached another 3.5 million Australians through our webinars, BeConnected program and online safety education providers.

eSafety works with a broad range of partners. Just today, we announced a new protocol with the Electoral Council of Australia and New Zealand to help ensure Australian election workers are protected from threats and harassment while performing their vital duty upholding our most basic democratic right: the right to vote.

To this end, it's important to note that there is a dark thread of suppression that drives online hostility, threats and intimidation which is designed to chill speech, undermine democratic debate and silence the target. This is what we fight against every day through our adult cyber abuse scheme and social media self-defence training.

Indeed, as more Australians turn to us for support, we must continue to elevate safety standards across the platforms they use every day. This means anticipating new risks and providing specific guidance for platforms to pivot from "dangerous by design" to "safety by design."

This is particularly important as we have seen generative AI create more vectors for synthetic child sexual abuse material and deepfake image-based abuse, and as our children start to wander into the high-sensory, hyper-realistic worlds of the metaverse.

The platforms that are monetising Australians' online lives and personal data need to employ the same safety standards and safeguards we expect of physical goods like toys, cars, medicines and food.

Our future national well-being and safety depend upon it.

We thank you for your support and I'd be happy to answer any questions you may have about our work.

*Please check against delivery.
