Australia’s eSafety Commissioner issues legal notices to Apple, Meta and Microsoft

Australian authorities have issued leading online platforms with legal documents ordering them to disclose how they are addressing the proliferation of child sexual exploitation material.

Australia’s eSafety Commissioner has issued legal notices to Apple, Microsoft, Snap, Omegle, and Meta (the parent company of Facebook and WhatsApp) under the Australian government’s new Basic Online Safety Expectations.

To continue reading this ABC News report, go to:
abc.net.au/news/2022-08-30/e-safety-commission-legal-notices-apple-meta-microsoft/101385294

Also see:

I’m putting tech giants on notice: protect our kids by Julie Inman Grant, Federal eSafety Commissioner
Australians’ reliance on online services to manage their daily activities has grown significantly over the past decade, particularly during the global pandemic.

From how we work to how we socialise and connect with friends and loved ones, what we can do with technology in 2022 is astounding.

While embracing a shift to a more connected online world brings many positives, we must also be alive to the harms that are all too common on online platforms.
smh.com.au/national/i-m-putting-tech-giants-on-notice-protect-our-kids-20220829-p5bdmx.html

Australia demands Apple, Meta, Microsoft share anti-abuse steps, threatens fines
An Australian regulator sent legal letters to Facebook owner Meta Platforms, Apple Inc and Microsoft Corp demanding they share their strategies for stamping out child abuse material on their platforms or face fines.

The e-Safety Commissioner, a body set up to protect internet users, said it used laws which took effect in January to compel the technology giants to disclose measures they were taking to detect and remove abuse material within 28 days. If they did not, the companies would each face a fine of A$555,000 ($383,000) per day.
reuters.com/technology/australia-demands-apple-meta-microsoft-share-anti-abuse-steps-threatens-fines-2022-08-30/

Tech platforms asked to explain how they are tackling online child sexual exploitation [news release]
Australia’s eSafety Commissioner has issued legal notices to some of the biggest tech companies in the world, requiring them to report on the measures they are taking to tackle the proliferation of child sexual exploitation material on their platforms and services.

The notices have been issued to Apple, Meta (and WhatsApp), Microsoft (and Skype), Snap and Omegle under the Australian Government’s new Basic Online Safety Expectations, a key part of the Online Safety Act 2021.

The Expectations set out the minimum safety requirements expected of tech companies who wish to operate in Australia and the steps they should take to protect Australian users from harm.

“The Basic Online Safety Expectations are a world-leading tool designed to encourage fundamental online safety practices and drive transparency and accountability from tech companies. They will help us ‘lift the hood’ on what companies are doing – and are not doing – to protect their users from harm,” eSafety Commissioner Julie Inman Grant said.

“Some of the most harmful material online today involves the sexual exploitation of children and, frighteningly, this activity is no longer confined to hidden corners of the dark web but is prevalent on the mainstream platforms we and our children use every day.

“As more companies move towards encrypted messaging services and deploy features like livestreaming, the fear is that this horrific material will spread unchecked on these platforms. Child sexual exploitation material that is reported now is just the tip of the iceberg – online child sexual abuse that isn’t being detected and remediated continues to be a huge concern.”

The decision to issue a notice is an information gathering process and may reflect a range of factors, including the number of complaints that eSafety has received, the reach of a service, or whether limited information is available on a company’s safety actions or interventions on their services.

Key safety risks for child sexual exploitation and abuse include the ability for adults to contact children on a platform, as well as features such as livestreaming, anonymity, and end-to-end encryption. This has lent itself to a range of proliferating harms against children, including online grooming, sexual extortion and coerced, self-produced child sexual exploitation material.

eSafety plans to issue further notices to additional providers in due course to build a comprehensive picture of online safety measures across a wide range of services.

Companies who do not respond to notices within 28 days could face financial penalties of up to $555,000 a day.

The spread of child sexual exploitation material online is a global scourge; last year 29.1 million reports were made to the National Center for Missing & Exploited Children.

Ms Inman Grant said eSafety has handled more than 61,000 complaints about illegal and restricted content since 2015, with the majority involving child sexual exploitation material. 

“We have seen a surge in reports about this horrific material since the start of the pandemic, as technology was weaponised to abuse children. The harm experienced by victim-survivors is perpetuated when platforms and services fail to detect and remove the content,” she said.

“We know there are proven tools available to identify this horrific material and stop it being recirculated, but many tech companies publish insufficient information about where or how these tools operate, and too often claim that certain safety measures are not technically feasible.

“Industry must be upfront on the steps they are taking, so that we can get the full picture of online harms occurring and collectively focus on the real challenges before all of us. We all have a responsibility to keep children free from online exploitation and abuse.”

The Expectations will work hand in hand with new mandatory codes being developed by the online industry with the assistance of eSafety. The codes will require providers of online products and services in Australia to do more to address the risk of harmful material, including child sexual exploitation material, on their services.

The next step will be for industry to commence public consultation on the draft mandatory codes, and eSafety encourages all stakeholders, including consumers, to provide their views. eSafety will continue to work closely with industry, offering any support, feedback and expertise we can.

“Our expanded powers under the Online Safety Act, combined with mandatory industry codes and the Basic Online Safety Expectations, create an umbrella of online protections for Australians of all ages,” Ms Inman Grant said.

Additional information is available at: esafety.gov.au/industry/basic-online-safety-expectations.
esafety.gov.au/newsroom/media-releases/tech-platforms-asked-explain-how-they-are-tackling-online-child-sexual-exploitation
