New online safety laws come into force
Australia’s eSafety Commissioner has welcomed the commencement of the Online Safety Act, which provides additional protections for Australians in the fight against online harms.
“The Online Safety Act has now come into force and makes Australia’s existing laws for online safety more expansive and much stronger,” said eSafety Commissioner Julie Inman Grant.
“These new laws cement eSafety’s role as a world leader in online safety. They place Australia at the international forefront of the fight against online abuse, providing additional protections for Australians through our approach of prevention, protection and proactive change in the online space.”
For the past year, eSafety has been planning for the commencement of the Online Safety Act, issuing a series of regulatory guidance pieces to prepare industry and other stakeholders for how eSafety will be implementing the new legislation.
Now that the Online Safety Act has commenced, eSafety can receive reports which fall under the new legislation through our website.
Every situation is unique and every matter reported to eSafety will be considered on a case-by-case basis. eSafety will be able to offer support, information and advice, even in situations where we are unable to take regulatory action under the new laws.
The new laws include:
A new Adult Cyber Abuse Scheme for Australian adults:
- eSafety will be able to act as a safety net to give Australian adults who have been subjected to serious online harm somewhere to turn if the online service providers have failed to act in removing the abusive content.
- If a platform fails to take action, people can visit the Report section of the eSafety website to make a report. Our new investigative and information gathering powers will allow us to investigate and assess complaints and decide what action we can take.
- eSafety can seek civil penalties (including fines) against those who posted the abusive material, or the provider of the service where it appears, if they do not comply with a removal notice from eSafety.
- The bar for what constitutes ‘adult cyber abuse’ has been set deliberately high, to ensure the scheme does not stifle freedom of speech.
- Under the law, to reach the threshold the abuse must be both ‘intended to cause serious harm’, and ‘menacing, harassing or offensive in all the circumstances’. Serious harm could include material which sets out realistic threats, places people in real danger, is excessively malicious or is unrelenting.
- If a matter does not meet the threshold, we will still be able to offer support, information and advice.
A stronger Cyberbullying Scheme for Australian children:
- eSafety’s existing Cyberbullying Scheme will be bolstered, enabling eSafety to order online service providers to remove material not just from social media sites, but from the other online services where many children spend their time – including online game chats, websites, and direct messaging platforms.
- If eSafety seeks removal of content, the online service provider has 24 hours to respond, down from the previous 48 hours (this may be longer in certain circumstances).
An updated Image-Based Abuse Scheme
- Online service providers will now have half the time – cut from 48 hours to 24 hours – to take down intimate images (including videos) after getting a removal notice from eSafety.
- The Act also gives eSafety new powers to expose repeated failures to deal with image-based abuse. For example, eSafety will be able to name and shame online service providers that, on two or more occasions in a 12-month period, allow publication of intimate images without the consent of the person shown, in breach of their own terms of service.
Targeted blocking power
- New Abhorrent Violent Conduct powers allow eSafety to direct internet service providers to block access to certain material that promotes, incites, instructs in or depicts abhorrent violent conduct, such as rape, torture, murder, attempted murder and terrorist acts, and is likely to cause significant harm to the Australian community.
- This will allow eSafety to respond to online crisis events, like the Christchurch terrorist massacre, by requesting or requiring internet service providers to block access to such extreme violent content.
Illegal and restricted online content
- Under updates to Australia’s Online Content Scheme, online service providers who fail to comply with eSafety removal notices to take down illegal and restricted online content that is accessible to Australians – such as child sexual exploitation material – face financial penalties of up to $111,000 per offence for individuals and $555,000 for corporations.
- Those services may also have their content delinked from search engines and their apps removed from app stores if they fail to comply.
- As a last resort, where a service is deemed to pose a serious threat to the safety of Australians, eSafety may also apply for a Federal Court order that the provider of a particular social media service, relevant electronic service, designated internet service, or internet carriage service stop providing that service in Australia.
Additionally, the Online Safety Act stipulates what the Australian Government now expects from online service providers. It raises the bar by establishing a wide-ranging set of Basic Online Safety Expectations, including that online service providers take reasonable steps to ensure that users are able to use the service in a safer manner.
The Expectations will also encourage the tech industry to be more transparent about its safety features, policies and practices. The Minister for Communications, Urban Infrastructure, Cities and the Arts will establish the Basic Online Safety Expectations through a legislative instrument called a determination, and eSafety will then have the power to require online service providers to report on how they are meeting any or all of them.