Statement On Apple User Reporting Safety Measure

This statement is attributable to eSafety Commissioner Julie Inman Grant:

eSafety welcomes Apple's introduction of a new feature allowing Australian children to easily report unwanted nude images directly to the company, nearly two years after eSafety's transparency reporting first highlighted the lack of such basic safety measures on its platforms.

Apple today began rolling out a new in-app reporting feature, available by default to children in Australia initially. Australian adults will be able to opt in to the feature, and availability in other parts of the world is expected to follow.

eSafety has repeatedly called for clear and accessible user reporting measures since our first Basic Online Safety Expectations (BOSE) transparency report in December 2022 found Apple had no in-service reporting mechanism. It is no coincidence that Apple is introducing this feature in Australia before rolling it out worldwide.

Easily discoverable ways to report harmful content and abuse are fundamental to Safety by Design and the BOSE.

Direct reporting, combined with Apple's existing feature that detects images and videos containing nudity sent through Messages, AirDrop, FaceTime and other on-device apps, is a positive initiative that will help protect children from receiving unsolicited nudes on Apple devices.

This is particularly important as we continue to see Australian children targeted for sexual extortion and grooming across a range of services and through a range of approaches. It can take only one report to have an offender banned and to prevent significant ongoing and future harm to multiple children.

Where appropriate, these user reports can also be referred to law enforcement, providing them with vital information to apprehend the offenders perpetrating these crimes.

While we welcome this new feature, we continue to call on Apple to broaden its approach by introducing measures that further protect children and all users from the full range of online harms, including terrorist content, technology-facilitated abuse and re-traumatisation through the hosting and sharing of child sexual exploitation material. Other services without these key safety features must follow suit.

This is a positive safety innovation, and as Apple continues to test the feature's efficacy, we look forward to learning how such tools can contribute to greater safety outcomes across the industry. We also commend Apple for making this Communication Safety tool available to developers.

eSafety has long championed Safety by Design, urging all technology companies to build safety protections into their products and services so that online harms are prevented from happening in the first place. Our aim is to 'lift all boats'.

Today, more than ever, the Australian community expects technology companies to take all reasonable steps to prevent their products and services being used to store, share and distribute horrific content like child sexual abuse material and terrorist material.

Sunlight is the best disinfectant and eSafety will continue to use its transparency powers via the BOSE to raise safety standards and make tech companies more accountable for harms playing out on their platforms and products.

This will work hand in hand with Phase 1 mandatory industry codes and standards, which require providers of online products and services in Australia to do more to address the risk of harmful material, including child sexual exploitation material.

With the industry standards covering messaging services, dating services, file-sharing services and other websites coming into force in December this year, relevant services should be working towards implementation and compliance now.

Phase 2 codes drafted by industry and currently open for public comment will provide protections for children against pornography and other age-inappropriate content.

Background

Apple previously made its Communication Safety feature available to help protect children by detecting images and videos containing nudity that they might receive or attempt to send in Messages, AirDrop, FaceTime messages, Contact Posters in the Phone app, and Photos.

Similarly, it has introduced Sensitive Content Warning, which allows adults to opt in to blurring images and videos that contain nudity. The analysis happens entirely on-device to protect privacy.
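For illustration, the developer-facing tool referenced above is understood to be Apple's SensitiveContentAnalysis framework (iOS 17 and later), which exposes the same on-device nudity detection to third-party apps. The sketch below, assuming that framework, shows how an app might check an image before displaying it; the function name checkImageForNudity is our own, and apps must hold Apple's sensitive content analysis entitlement for the analyzer to return results.

```swift
import Foundation
import SensitiveContentAnalysis

// A minimal sketch, assuming Apple's SensitiveContentAnalysis framework.
// Detection runs entirely on-device; no image data leaves the device.
// Requires the com.apple.developer.sensitivecontentanalysis.client entitlement.
func checkImageForNudity(at url: URL) async {
    let analyzer = SCSensitivityAnalyzer()

    // The policy reflects the user's settings: it is .disabled unless
    // Communication Safety or Sensitive Content Warning is switched on.
    guard analyzer.analysisPolicy != .disabled else {
        print("Sensitive content analysis is turned off on this device.")
        return
    }

    do {
        let analysis = try await analyzer.analyzeImage(at: url)
        if analysis.isSensitive {
            // The app chooses its own intervention, for example blurring
            // the image and offering block or report options.
            print("Image flagged as sensitive; blur before display.")
        } else {
            print("Image not flagged.")
        }
    } catch {
        print("Analysis failed: \(error)")
    }
}
```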

When a user sees a Communication Safety or Sensitive Content Warning about an image or video containing nudity, they will now have the option to report it to Apple alongside the existing options, which include blocking the sender, messaging someone they trust for help, and viewing resources for help and support online.

If the user chooses to report to Apple, the device will prepare a report that includes the images or videos that were determined to contain nudity, as well as messages sent immediately before and after the image or video. The content included in the report is designed to help Apple take action as the situation warrants.
