Saturday, November 9, 2024

Australia’s eSafety commissioner waters down child abuse detection rules in online safety standards


The Australian online safety regulator has watered down new rules to force tech companies to detect child abuse and terror content on encrypted messaging and cloud storage services, after some of the biggest tech firms in the world warned it could lead to mass government surveillance.

In November, the eSafety commissioner announced draft standards that would require the operators of cloud and messaging services to detect and remove known child abuse and pro-terror material “where technically feasible”, as well as disrupt and deter new material of the same nature.

It did not specify how the companies would need to comply technically, but in an associated discussion paper, the office said it “does not advocate building in weaknesses or back doors to undermine privacy and security on end-to-end encrypted services”.

However, because this was not explicitly defined in the standards, tech companies and privacy advocates raised concerns that they would not protect end-to-end encryption. Apple warned the rules would leave the communications of everyone who uses the services vulnerable to mass surveillance. The tech giant said it was concerned that “technical feasibility” would be limited to whether it was financially viable for companies to implement – not whether it would break encryption.

But in the finalised online safety standards lodged in parliament on Friday, the documents specifically state that companies will not be required to break encryption and will not be required to undertake measures not technically feasible or reasonably practical.

That includes instances where it would require the provider to “implement or build a systemic weakness or systemic vulnerability into the service” and, “in relation to an end-to-end encrypted service – implement or build a new decryption capability into the service, or render methods of encryption used in the service less effective”.


When companies rely on these exceptions, the standards will require them to “take appropriate alternative action”, and eSafety can require them to provide information about what those alternatives are.

“We understand different services may require different interventions but the clear message of these standards is that no company or service can simply absolve themselves from responsibility for clear and tangible actions in combatting child sexual abuse and pro-terror material on their services,” the eSafety commissioner, Julie Inman Grant, said in a statement.

Despite the compromise in the final standards, in an opinion piece published in The Australian ahead of the release of the standards on Friday, Inman Grant hit back at the criticism of the proposals, saying tech companies had claimed the standards “represented a step too far, potentially unleashing a dystopian future of widespread government surveillance”.

The real dystopian future, she said, would be one where “adults fail to protect children from vile forms of torture and sexual abuse, then allow their trauma to be freely shared with predators on a global scale”.

“That is the world we live in today.”

The backdown is a win for the tech firms that provide end-to-end encrypted messaging services, including Apple, Proton and Signal, which had all raised concerns about the proposal.

Proton had threatened to challenge the standards in court if they went ahead.

Encrypted messaging company Signal this week complained to the European Union over a similar proposal to force tech companies to carry out “upload moderation” – scanning content shared via encrypted communications before it is encrypted.

The company’s president, Meredith Whittaker, told Guardian Australia that no matter how regulators framed it, it was just another way of calling for mass scanning of private communications.

“We have been consistently trying to clarify the technical reality and the stakes of the proposals that are being put forward,” she said, adding that they “put lipstick on a mass surveillance proposal and say that it isn’t actually undermining privacy”.

“What we’re talking about is a kind of self-negating paradox. You cannot do mass surveillance privately, full stop.”

The standards will go into effect six months after a 15-day disallowance period in parliament.
