Australian eSafety Commissioner’s Penalty Highlights Steep Consequences for Telegram Over Reporting Delays

Australia’s eSafety Commissioner ordered Telegram to pay AUD 1 million for failing to meet its transparency obligations. Officials requested details of the platform’s measures against terrorist material and child sexual abuse content, but Telegram delayed its response by months, triggering enforcement action under the Online Safety Act.

Transparency Requirements Under Australia’s Online Safety Act Spark Tough Penalties for Telegram and Tech Platforms Everywhere

Australia’s eSafety Commissioner has handed down a penalty of around AUD 1 million to Telegram after the messaging service failed to meet transparency reporting requirements. These obligations were set under the Online Safety Act, which seeks to ensure that technology platforms take adequate steps to remove or block dangerous content.

The penalty arrives following a series of notices sent out in March 2024, when several platforms were asked to show how they combat terrorist and extremist material, as well as child sexual abuse content.

Telegram, in particular, was instructed to detail the methods it uses to identify and address these harms within its channels.

Missed Deadline and Delayed Response

The Commissioner had initially set 6 May 2024 as the final deadline for Telegram’s response. Instead, the platform provided the requested information more than five months later. This delay, according to the Commissioner, hampered investigations and made it harder to track Telegram’s compliance efforts.

By failing to submit a timely report, Telegram triggered the possibility of a formal infringement notice and, ultimately, a financial penalty. The Commissioner’s official statement reveals that delays of this nature are taken seriously, as authorities rely on transparency data to gauge whether platforms are actively protecting users.

Key Requirements Under the Online Safety Act

Under the Online Safety Act, platforms must outline any preventative steps they have taken to detect harmful or illegal content, such as terrorist propaganda or child sexual exploitation material.

This includes clarifying how they identify live streaming of objectionable material and what procedures they follow to block or remove it. A core reason for these reports is to ensure that apps and services remain accountable when hosting large user communities.

Telegram, known for its encrypted messaging, was expected to provide detailed accounts of its content-moderation methods and to ensure that users producing or sharing illegal material are swiftly dealt with.

Further Consequences at Stake

Telegram now has 28 days to request withdrawal of the infringement notice, pay the AUD 1 million penalty, or ask for more time to pay.

If the company decides against paying, the eSafety Commissioner may escalate the situation by seeking a civil penalty order in the Federal Court of Australia.

This step, according to Australian regulators, keeps platforms on notice that non-compliance will not be brushed aside. More details on the enforcement action can be found on the Commissioner’s website.
