Byron Shire
May 24, 2024

Tech giants fail to tackle child sex material 


Australia’s eSafety Commissioner says it has issued legal notices to Twitter/‘X’, Google, TikTok, Twitch and Discord.

Australia’s eSafety Commissioner has released its second report under world-leading transparency powers showing some of the biggest tech companies aren’t living up to their responsibilities to tackle the proliferation of child sexual exploitation, sexual extortion and the livestreaming of child sexual abuse. 

In February this year, eSafety issued legal notices to Twitter (subsequently rebranded as ‘X’), Google, TikTok, Twitch and Discord under Australia’s Online Safety Act. The new transparency powers required the tech companies to answer questions about measures they have in place to deal with the issue. 

Serious shortfalls

The report, which summarises their answers, highlights serious shortfalls in how some companies detect, remove and prevent child sexual abuse material and grooming; inconsistencies in how companies deal with this material across their different services; and significant variations in the time it takes them to respond to public reports.

eSafety Commissioner Julie Inman Grant said the proliferation of online child sexual exploitation is a growing problem both in Australia and globally, and that technology companies have a moral responsibility to protect children from having sexual exploitation and abuse stored, shared and perpetrated on their services. ‘We really can’t hope to have any accountability from the online industry in tackling this issue without meaningful transparency, which is what these notices are designed to surface,’ said Ms Inman Grant.

‘Our first report featuring Apple, Meta, Microsoft, Skype, Snap, WhatsApp and Omegle uncovered serious shortfalls in how these companies were tackling this issue.

They need to do better

‘This latest report also reveals similar gaps in how these five tech companies are dealing with the problem and how they are tackling the rise in sexual extortion and we need them all to do better.

Ms Inman Grant said these were serious crimes playing out on these platforms, committed by predatory adults against innocent children. ‘The community expects every tech company to be taking meaningful action.

‘Importantly, next year we will have industry codes and standards in place which work hand-in-hand with these Basic Online Safety Expectations transparency powers to ensure companies are living up to these responsibilities to protect children.’ 

Non-compliance

eSafety also found that two providers, Twitter/X and Google, did not comply with the notices given to them, with both companies failing to adequately respond to a number of questions in their respective notices.

Google has been issued a formal warning notifying it of its failure to comply, after the company provided a number of generic responses to specific questions and gave aggregated information when asked about specific services.

Twitter/X’s non-compliance was found to be more serious, with the company failing to provide any response to some questions and leaving some sections entirely blank. In other instances, Twitter/X provided responses that were incomplete and/or inaccurate.

Twitter/X did not respond to a number of key questions including the time it takes the platform to respond to reports of child sexual exploitation; the measures it has in place to detect child sexual exploitation in livestreams; and the tools and technologies it uses to detect child sexual exploitation material.

Safety and public policy staff

The company also failed to adequately answer questions relating to the number of safety and public policy staff still employed at Twitter/X following the October 2022 acquisition and subsequent job cuts.

Twitter/X has been issued with an infringement notice for $610,500 and has 28 days to request the withdrawal of the infringement notice or to pay the penalty. If Twitter chooses not to pay the infringement notice, it is open to the Commissioner to take other action. eSafety has also published a statement, called a service provider notification, about the non-compliance by Twitter/X.

Ms Inman Grant said Twitter/X and Google’s non-compliance was disappointing especially as the questions relate to the protection of children and the most egregious forms of online harm.

‘Twitter/X has stated publicly that tackling child sexual exploitation is the number one priority for the company, but it can’t just be empty talk; we need to see words backed up with tangible action.

‘If Twitter/X and Google can’t come up with answers to key questions about how they are tackling child sexual exploitation they either don’t want to answer for how it might be perceived publicly or they need better systems to scrutinise their own operations. Both scenarios are concerning to us and suggest they are not living up to their responsibilities and the expectations of the Australian community.’

Some of the key findings in the report featuring Twitter/X, TikTok, Google, Twitch and Discord include:

  • While YouTube, TikTok and Twitch are taking steps to detect child sexual exploitation in livestreams, Discord is not, saying it is ‘prohibitively expensive’. Twitter/X did not provide the information required.
  • TikTok and Twitch use language analysis technology to detect CSEA activity such as sexual extortion across all parts of their services whereas Discord does not use any such detection technology at all. Twitter/X uses tools on public content, but not on direct messages. Google uses technology on YouTube, but not on Chat, Gmail, Meet and Messages.
  • Google (with the exception of its search service) and Discord are not blocking links to known child sexual exploitation material, despite the availability of databases from expert organisations like the UK-based Internet Watch Foundation.
  • YouTube, TikTok and Twitch are using technology to detect grooming, whereas Twitter/X, Discord and other Google services are not (Meet, Chat, Gmail, Messages).
  • Google is not using its own technology to detect known child sexual exploitation videos on some of its services – Gmail, Chat, Messages.
  • In the three months after Twitter/X’s change in ownership in October 2022, its proactive detection of child sexual exploitation material fell from 90 per cent to 75 per cent. The company said its proactive detection rate had subsequently improved in 2023.
  • For Discord and Twitch, which are partly community-moderated services, professional safety staff are not automatically notified when a volunteer moderator identifies child sexual exploitation and abuse material.
  • Significant variations in median response times to user reports of child sexual exploitation material exist – TikTok says it responds within five minutes for public content, Twitch takes eight minutes, Discord took 13 hours for direct messages, while Twitter/X and Google did not provide the information required.
  • Significant variation exists in the languages covered by content moderators. Google said it covers at least 71 languages and TikTok 73. In comparison, Twitter said it covered only 12 languages, Twitch reported 24 and Discord reported 29. This means that some of the top five non-English languages spoken at home in Australia are not by default covered by Twitter, Discord and Twitch moderators. This is particularly important for harms like grooming or hate speech, which require context to identify.

The full report can be found on the eSafety Commissioner’s website.




