Australia’s online safety watchdog, eSafety Commissioner Julie Inman Grant, has publicly criticised major technology companies — including Apple, Google, Meta and Microsoft — for not doing enough to prevent child sexual exploitation and abuse (CSEA) on their platforms and services. The issue came into sharper focus amid new transparency reporting requirements for tech firms under Australia’s online safety laws.
The watchdog’s remarks highlight serious gaps in how tech giants detect and combat child sexual abuse content — particularly newly created or live-streamed material — despite having the technological capability and resources to intervene effectively.
Key Failures Identified by the eSafety Report
In its latest transparency report, eSafety outlined several major shortcomings in the tech companies’ efforts to protect children online:
1. Inadequate Detection of Live Abuse
Tech platforms are not consistently using tools to detect child sexual abuse occurring in real time, particularly during live video calls or livestreams. Some services do not deploy proactive detection tools at all — even though the technology exists.
2. Insufficient Tools Across Services
Even where detection tools are available, companies often apply them only to certain parts of their services while leaving gaps elsewhere. For instance:
Meta detects abuse on some of its services but not on its messaging platforms.
Google uses detection on YouTube but not on Google Meet or Chat.
Some messaging and collaboration apps lack any proactive detection at all.
3. Lack of Language Analysis for Grooming and Extortion
Platforms have been slow to incorporate language‑analysis tools that could help identify grooming, sexual extortion, or other harmful conversations — even after these tools were shared with companies as common indicators of abuse.
Why This Matters — Rising Harm and Public Concern
Australia’s concerns are grounded in real‑world data showing the prevalence and persistence of online child sexual exploitation:
The Australian Centre to Counter Child Exploitation received nearly 83,000 reports of child sexual abuse material during the 2024–25 year — a 41% increase from the previous period.
These reports span a range of harms including grooming, sexual extortion, livestreamed abuse and distribution of illegal content.
Child protection advocates argue that without stronger safeguards and enforcement, dangerous materials and abusive behaviours will continue to spread freely on popular digital platforms.
Regulatory Measures and Calls for Accountability
Transparency Notices and Reporting
Under Australia’s Online Safety Act, large tech companies must submit detailed reports every six months describing how they deal with child sexual abuse material, grooming, extortion and AI‑generated harmful content. Companies that fail to comply can face significant fines for non‑response or incomplete reporting.
Calls for Stronger Laws
eSafety and child safety organisations are urging the government to move beyond reporting and implement a “digital duty of care” — legal requirements forcing tech companies to embed child protection and risk mitigation into product design and operations, rather than relying on voluntary measures.
Industry Response and Progress
While the report acknowledges some improvements — such as faster response times for known content and the use of safety features like blurring of sensitive images — the overall picture remains concerning. Australia’s regulator stresses that the failure to fully protect children reflects a lack of industry will, not a lack of technical capability.
Context: Broader Child Safety Efforts in Australia
This scrutiny of tech platforms comes against the backdrop of other child protection initiatives in Australia, including:
A nationwide ban on social media use by children under 16, with millions of accounts deactivated under the new law.
These combined efforts reflect Australia’s attempts to tighten online safety and hold tech firms accountable for harms occurring on their services.
The Bottom Line
Australia’s eSafety Commissioner and child protection advocates argue that despite the laws and reporting requirements now in place, Big Tech is still falling short in its responsibility to prevent child sexual exploitation and abuse online. They call for stronger enforcement, better technology deployment, and legal duties to make digital platforms inherently safer for children.