Requiring Big Tech to enforce its own safety standards is among the key recommendations of a report by the Australian Government's Select Committee on Social Media and Online Safety into online harms on social media platforms.
In its unanimous report, the Committee found that the safety of people online is being threatened by individuals who engage in harmful conduct. The harms experienced by victims of online abuse leave a long trail of trauma and suffering, as many witnesses told the Committee in evidence.
Committee Chair Lucy Wicks MP said the evidence received by the Committee highlighted the necessity of a three-part response: social media platforms focusing on user safety and enforcing their policies; the Government appropriately regulating and monitoring the sector; and users understanding that while respectful dissent and disagreement are part of online discourse, abuse is not and should not be tolerated.
“The recommendations in this report are an important next step in making our online world and social media platforms safer for all,” Ms Wicks said.
“The Australian Government is leading the world in online safety, but technology and online predators evolve quickly, so the Government must continue to hold social media companies to account and support victims of abuse.
“For too long social media platforms have been able to ‘set the rules’, enabling the proliferation of online abuse. The balance of responsibility for the safety of users online, which until recently has been primarily on users, must be ‘flipped’ to ensure that social media platforms bear more of the burden of providing safety for their users.
“To protect Australians, social media companies have to take responsibility to enforce their terms of service, prevent recidivism of bad actors, prevent pile-ons or volumetric attacks, prevent harms across multiple platforms and be more transparent about their use of algorithms.
“The inquiry has also focused on what more can be done to address individual actions and behaviours online by building on the eSafety Commissioner’s existing education programs and government awareness campaigns, giving Australians, and especially children, more information about how to engage safely in online discourse.”
Key recommendations include:
The establishment of a Digital Safety Review to review all online safety legislation and government programs, with a view to simplifying regulations into a single framework and making recommendations to the Australian Government on potential proposals for mandating platform transparency;
Requesting that the eSafety Commissioner examine the extent to which social media companies enforce their policies in relation to users experiencing harm, in addition to requiring them to report to Government regarding reducing harm caused by their algorithms;
Addressing technology-facilitated abuse in the context of family and domestic violence, including the recommendation of significant additional Australian Government funding for support services;
Mandating that all social media companies set as a default the highest privacy settings for people under the age of 18 years; and
Increasing the reach of educational programs aimed at both adults and young people regarding online harms, with a focus on the eSafety Commissioner’s powers to remove harmful content and the mechanisms through which victims can report harmful content and online abuse.
Meanwhile, outgoing Australian Competition and Consumer Commission (ACCC) Chair Rod Sims has said that the development of new upfront rules forcing dominant digital platforms to treat their users fairly is the important next step in reforming Australia’s consumer protection laws.
The next phase of consumer law reforms to be debated should relate to digital platforms with market power, Mr Sims said.
“Digital platforms have business models that seek to exploit all the data they have on you. We need laws to prevent the misuse of this data, whether by preventing so-called ‘dark patterns’ that get you to act against your best interests, by requiring steps to prevent scams, or by allowing appropriate dispute resolution,” Mr Sims said.
Mr Sims said it remained important for regulators to be vocal advocates for the rights of consumers and to identify opportunities for law reforms when needed.
“There are some who believe that regulators should only enforce the law as it is and not suggest publicly any need for law change. What a loss to the public debate this would be,” Mr Sims said.
Visit https://www.aph.gov.au/Parliamentary_Business/Committees/House/Social_Media_and_Online_Safety