Call for Digital Duty to Curb Child Abuse on Aussie Screens

IJM Australia

eSafety's transparency report, published yesterday, reveals gaps in Telegram's and Reddit's detection and deterrence of child sexual exploitation and abuse (CSEA) on their services.

Failures to detect new CSEA images and videos should be urgently addressed to stop the circulation of this illegal material and protect the children depicted, who may be at immediate risk of harm.

A legislated digital duty of care could require tech companies to ensure measures are in place to mitigate the risk of new child sexual abuse material being created or shared on their platforms, and to report on and improve these measures if they do not adequately address this risk.

Telegram and Reddit provided responses to eSafety Commissioner Julie Inman Grant's questions about how they respond to the circulation of new child sexual abuse images and videos on their services over the April 2023 to February 2024 reporting period.

Whilst Reddit reported it uses text classifiers to detect new images and videos, which may depict children presently in abusive situations, the eSafety Commissioner noted that its lack of tools such as nudity detection and age estimation for new images and videos "may mean key indicators of CSEA are missed".

IJM affirms that Reddit should do more to detect CSEA in new images and videos by deploying commercially available AI tools, such as nudity detection and age estimation, to detect and deter the circulation of this content and protect children from abuse.

Reddit stated that once CSEA is detected, the content is blocked/removed and the account is permanently banned; an enforcement ticket is created and prioritised for human review; and depending on the outcome of human review, the company may make a report to the National Center for Missing and Exploited Children and take further enforcement action, including account sanctions.

After eSafety fined Telegram more than $950,000 for missing the reporting deadline by over five months, the company provided the requested information in response to eSafety's transparency notice.

Telegram stated it uses an internal hash matching system to detect known CSEA images, except on Chats and Secret Chats, and is in the process of joining the Internet Watch Foundation's safety programs to gain access to its hash lists.

Asked why hash matching tools were not used on user reports from Chats or Secret Chats, Telegram stated that the company was "founded on the principle of defending user privacy and their right to private communication" and that "this commitment prioritizes user privacy above all". Telegram added that because of this commitment, the encrypted contents of private chats are always protected, ensuring that the confidentiality of private correspondence is never compromised.

Telegram stated that whilst it uses internal AI and machine learning models to detect new CSEA images and videos in public communications, it does not use these models in Chats, Secret Chats, Private Group Chats or Private Channels.

eSafety noted that not using proactive detection tools to identify and review potential CSEA material increases the likelihood that such material will remain undetected and continue to circulate on these parts of the service.

This could place children who are in situations of abuse at further risk by creating a safe haven for perpetrators to share images and videos of child sexual exploitation and abuse without fear of detection.

Telegram stated it relies on alternative signals to assess and prioritise reports made about material in end-to-end encrypted parts of the service. eSafety noted that this may limit Telegram's ability to review, assess, prioritise and respond to user reports about harmful and illegal material or activity, such as child sexual abuse material, occurring in Telegram's Secret Chats.
