The independent review of Australia’s online safety framework advocates for an approach mirroring the UK’s Online Safety Act, even though the UK law has not yet been fully implemented
Three years on, Australia’s Online Safety Act is in line for a UK-inspired makeover
On 4 February 2025, the Department of Infrastructure, Transport, Regional Development, Communications and the Arts released the report of its review of the Online Safety Act 2021. The independent review, conducted by Delia Rickard (former Deputy Chair of the Australian Competition and Consumer Commission), began in November 2023, in line with the legal requirement that the operation and effectiveness of the Online Safety Act be examined within three years of its passage. Citing the EU, the UK and Canada as examples of jurisdictions developing systems-based regulatory frameworks for online safety, the review ultimately made 67 non-binding recommendations to strengthen the law in light of the growing scale of online harms. While stating that the regime’s original aim “to improve and promote online safety for Australians” was world-leading when the framework was first established in 2015, the review sets out five new motivating principles on which its recommendations are based:
Promoting human rights and safety;
Promoting and protecting the best interests of the child;
Building an evidence base around online safety and existing and emerging online harms;
Preventing and alleviating online harm present in Australia; and
Improving online safety for all in Australia by advancing service provider responsibility for preventing and mitigating harms, alongside user empowerment and transparency.
The recommended changes to the law encompass the types of platforms to be regulated, the categories of harms to be mitigated and the structure and powers of the regulator in enforcing the law, some of which have already been endorsed by the Government.
The Government has already adopted a recommendation to introduce a digital duty of care
The most sweeping change proposed by the review is the introduction of a digital duty of care to ground the specific obligations of regulated platforms. Drawing on the EU’s Digital Services Act and the UK’s Online Safety Act, the review argues that a duty of care in the Australian law would shift more of the burden of maintaining safety online from the individual user to the platform by requiring due diligence to identify and prevent harms systemically. In addition to raising the expectations placed on regulated platforms, Rickard also wrote that a duty of care would contribute to greater international regulatory coherence, reducing compliance costs and regulatory burden. The Australian Government has already adopted this recommendation, announcing in November 2024, in response to the publication of a draft of this review, that it would legislate for a digital duty of care.
The review advocates for requiring platforms to mitigate both illegal and legal but harmful content
Within a new duty of care, the review also recommends the definition of specific and enduring categories of harms to be addressed by regulated platforms, including:
Harms to young people;
Harms to people’s mental and physical well-being;
Instruction or promotion of harmful practices;
Threats to national security and social cohesion; and
Other illegal content, conduct and activity.
Again like the UK’s Online Safety Act, the Australian changes appear to capture legal but harmful content, an ambitious approach that addresses a broader scope of harms. The review also recommends that the Australian law adopt a UK-inspired approach to identifying platforms for additional obligations based on both their size and their risk profile, within a simplified set of service categories. Designated platforms would be required to follow a refined and slightly expanded set of obligations, including risk assessments, compliance audits, transparency reporting, data sharing and complaints handling.
The eSafety Commissioner would be reformed into an independent commission with vastly expanded fining powers
To better implement and enforce these expanded duties, Rickard also advocates for strengthening Australia’s online safety regulator, the eSafety Commissioner, in both structure and power. The report recommends that the Commissioner be reformed into an Online Safety Commission composed of a Chair, a Deputy Chair and a Commissioner, with flexibility to grow to up to nine members in the future. The Online Safety Commission would be separated from the Australian Communications and Media Authority (ACMA), which currently supports the eSafety Commissioner as an independent statutory office. That growth would be supported by a cost recovery mechanism, such as an industry levy, through which regulated sectors would fund their own oversight.

A new Commission would also be empowered to levy fines more than 60 times greater than the current regime allows. The report recommends that the maximum fine for noncompliance be raised from A$782,500 (£395,380) to five percent of a firm’s global turnover or A$50m (£25m), a vast expansion in enforcement tools already mirrored in the under-16s social media ban passed last year. Read in its entirety, the statutory review of Australia’s Online Safety Act advocates for an approach most similar to the UK’s Online Safety Act, suggesting a “Westminster effect” may be emerging even before the efficacy of the UK’s law is fully tested.