
Misinformation Code: Fact or Fiction?

Summary

– Digi has launched a public consultation to review the Australian Code of Practice on Disinformation and Misinformation following criticism of its effectiveness.
– A Reset Australia report from May 2024 concluded the code has completely failed to mitigate misinformation and disinformation in Australia.
– The report identified systemic failures in platforms’ content moderation, advertising approval systems, and transparency measures.
– Digi’s review will consider recommendations from ACMA’s third report but does not mention the Reset Australia findings.
– Key areas for the review include improving transparency reporting, complaints handling, and the code’s scope and governance structure.

A major industry review is now underway for Australia’s primary framework designed to combat false information online. The Digital Industry Group (Digi), an association representing major technology firms, has initiated a public consultation to reassess the Australian Code of Practice on Disinformation and Misinformation (ACPDM). The move comes amid a starkly critical report asserting that the code has failed to protect Australians from the spread of falsehoods.

Digi’s membership includes some of the world’s most influential digital platforms, such as Apple, Google, Meta, Microsoft, TikTok, and X (formerly Twitter). The organization first introduced the voluntary code in 2021. According to Digi’s managing director Sunita Bose, the framework has served as a cornerstone for the nation’s policy response to the intricate challenges posed by misinformation and disinformation.

However, a contrasting and highly critical perspective emerged from a Reset Australia report published in May 2024, titled “Functioning or failing? An evaluation of the efficacy of the Australian Code of Practice on Disinformation and Misinformation.” This assessment delivered a far less optimistic verdict, declaring the current strategy a “complete failure” in its mission to mitigate harmful false information within the country.

The Reset Australia review, documented by Australian Policy Online, pointed to several systemic shortcomings. It found that the content moderation and advertising approval systems used by platforms consistently failed to reduce the risks associated with misinformation. Furthermore, the report highlighted significant problems with oversight and transparency. It noted “strong discrepancies” between the claims platforms made in their transparency reports and the results of independent testing. The existing complaints process was also deemed inadequate, unable to properly resolve the issues raised by users.

In its newly released discussion paper, Digi outlines the key areas it will examine during the code’s review. The organization states it will consider recommendations from the Australian Communications and Media Authority’s (ACMA) third report on digital platforms. Notably, Digi’s paper does not reference the damning Reset Australia report.

The consultation will focus on several potential improvements. These include re-evaluating the code’s overall scope and enhancing the transparency reporting process to give the public more substantial and useful information. Digi is also considering how the code can promote a wider “ecosystem approach,” involving entities beyond just the major tech platforms. Other areas for potential reform involve the public complaints handling process and the role and membership of the code’s administrative sub-committee.

The Reset Australia report concluded with a powerful indictment of the status quo. It asserted that platform systems routinely break their promises on curbing misinformation, that transparency measures are misleading, and that the complaints process is ineffective at securing necessary corrections. Taken together, these failures represent a total breakdown of the current governance model for tackling false information in Australia, prompting calls for a fundamental overhaul of the entire system.

(Source: ITWire Australia)

Topics

misinformation control, disinformation mitigation, code review, platform accountability, content moderation, transparency reporting, systemic failings, governance failure, digital platforms, public consultation