Complications of Government in Content Moderation

Group 04

The Australian High Court ruled last month that media publishers on Facebook are liable for defamatory comments made on their posts by third-party users. This places the responsibility on the publisher of a post to moderate its comment section (Rolph, 2021).

However, Prime Minister Scott Morrison believes this responsibility should sit with the platform itself and has indicated that the federal government will look to regulate content moderation on Facebook and other social platforms (Borys, 2021). Government involvement in content moderation could affect the process in a range of ways, from restricting international users’ access to content to introducing biased moderation.

Content moderation can be defined as “the screening, evaluation, categorization, approval or removal/hiding of online content according to relevant communications and publishing policies” (Flew et al., 2019). The central challenge for moderators is ensuring content is lawful, non-offensive and free from harassment, while also balancing users’ freedom of speech and advertisers’ interests. When enforcing content rules, moderators draw on traditional concerns around media and public culture, including sex, violence, and obscenity (Gillespie, 2018), but because these platforms are used for a multitude of purposes, the edges of these traditional cultural rules become blurred.

A recent example of the difficult decisions involved in content moderation is the footage of George Floyd’s murder in 2020. Footage this graphic would ordinarily be taken down because it depicts violence and murder and is emotionally distressing. However, after it was widely shared online, the video became the centre of a global protest movement for racial equality. The footage remains available, but with content warnings attached, and YouTube has gone as far as age-restricting it (Lerman, 2020).

However, content moderation isn’t always this successful. One failure is the 2019 Christchurch mosque shootings, which were streamed live on Facebook, with footage remaining online for days after the attack. The incident left New Zealand companies considering whether to pull their advertising from platforms involved in disseminating the video (Murrell, 2019).

A major issue that has arisen recently for content moderators is the spread of misinformation, particularly about Covid-19. With 33% of Australians accessing their news through social media (ACCC, 2019), this misinformation reaches a significant share of the population. Much of the Covid-19 misinformation spread online is disseminated through the major platforms, including Facebook and YouTube, and many lawmakers have suggested these sites take responsibility for it (Paul, 2021).

In response, Facebook has written a Covid-19 and vaccine-specific policy, which prohibits false information on the basis that it contributes to real-world physical harm (Facebook, 2021). Facebook has reportedly taken down 20 million pieces of coronavirus misinformation, as well as 3,000 accounts, pages, and groups (McEvoy, 2021), including Australian MP Craig Kelly’s page (Taylor & Karp, 2021).
In the past, Facebook took a more relaxed approach to misinformation in order to protect users’ freedom of speech, as with political advertising during the 2016 US election (Paul, 2019). This change of direction may be attributed to concerns from advertisers, as well as external pressure from governing bodies (Abril, 2021).

While lawmakers view this move as a positive one, many users see it as a step towards censorship and a limitation on their voice (Niemiec, 2020). After having his page removed from Facebook for repeatedly sharing Covid-19 misinformation, Craig Kelly took aim at the platform, claiming it was censoring him and comparing the removal to book burning (Taylor & Karp, 2021). Evidently, content moderation is a crucial part of the digital platform ecosystem with far-reaching effects on our lives, but it is complicated by a range of competing pressures.

This brings us to the issue of government intervention in content moderation. As previously mentioned, the Prime Minister is looking to introduce government regulation of content moderation on social media, specifically citing harassment as his focus (Borys, 2021). While government intervention could help platforms ensure content is legal by outlining universal guidelines, it also carries risks of its own.

In 2019, the ACCC published its Digital Platforms Inquiry, which examined “the implications and consequences of the business models of digital platforms for competition, consumers, and society” (ACCC, 2019). As part of the inquiry, it explored the potential impact of consumers accessing misleading and/or harmful news stories on digital platforms, which, as outlined earlier, is a growing issue. Consequently, it recommended that “digital platforms establish an industry code to govern the handling of complaints about disinformation” (ACCC, 2019).

However, allowing the government to define and govern ‘misinformation’ without an independent body could lead to misuse of power. For example, under the current parliamentary definition, satire and parody are labelled as a form of misinformation (Parliament of Australia, 2018). Under government-run content moderation, a politician could conceivably have parody content taken down as misinformation, harassment or even defamation, as in John Barilaro’s current defamation case against YouTube satirist Friendlyjordies (McKinnell, 2021). Whilst this is an extreme scenario, it is important to consider the full range of potential outcomes of government intervention.

One of the challenges of regulating digital platforms is their global reach: regulations aimed at Australian users can affect users internationally as well (Gillespie, 2018). This was the case in February 2021, when Facebook temporarily banned news content in Australia, which also meant that no users anywhere on the platform could share news from Australian sites (Hurst et al., 2021). While access was eventually restored, Australians living overseas were temporarily cut off from news from home. If the government goes ahead with its plan to increase its role in content regulation, it must understand that these regulations have the potential to affect how users outside Australia access content.

Content moderation is a part of our digital lives that we often overlook, yet it plays an immensely important role in how we access and interact with content. There are arguments both for and against further government regulation of this moderation, but if the government is to put these regulations in place, it must first consider all of the factors that define such a complex process.

Bibliography
Abril, D. (2021). Facebook just put the final nail in Mark Zuckerberg’s free speech master plan. Fortune. Retrieved 15 October 2021, from https://fortune.com/2021/06/04/facebook-free-speech-politicians-policy-newsworthiness-hate-speech-misinformation/
ACCC. (2019). Digital platforms inquiry – final report. Canberra: Commonwealth of Australia. Retrieved from https://www.accc.gov.au/publications/digital-platforms-inquiry-final-report
Borys, S. (2021). Social media a ‘coward’s palace’, says Prime Minister, as he promises more action to hold online abusers responsible. ABC News. Retrieved from https://www.abc.net.au/news/2021-10-07/prime-minister-defends-dutton-twitter-defamation-action/100522002
Facebook. (2021). COVID-19 policy updates and protections. Facebook Help Centre. Retrieved 15 October 2021, from https://www.facebook.com/help/230764881494641/?helpref=uf_share
Flew, T., Martin, F., & Suzor, N. (2019). Internet regulation as media policy: Rethinking the question of digital communication platform governance. Journal Of Digital Media & Policy, 10(1), 33-50. doi: 10.1386/jdmp.10.1.33_1
Hurst, D., Taylor, J., & Meade, A. (2021). Facebook reverses Australia news ban after government makes media code amendments. The Guardian. Retrieved from https://www.theguardian.com/media/2021/feb/23/facebook-reverses-australia-news-ban-after-government-makes-media-code-amendments
Gillespie, T. (2018). CHAPTER 1. All Platforms Moderate. In Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions That Shape Social Media (pp. 1-23). New Haven: Yale University Press. https://doi-org.ezproxy.library.sydney.edu.au/10.12987/9780300235029-001
Lerman, R. (2020). YouTube is starting to age-restrict news videos of George Floyd’s killing. The Washington Post. Retrieved from https://www.washingtonpost.com/technology/2020/06/04/youtube-restricts-floyd-videos/
McEvoy, J. (2021). Facebook says it has removed 20 million pieces of Covid misinformation, but sees signs vaccine hesitancy is declining. Forbes. Retrieved 15 October 2021, from https://www.forbes.com/sites/jemimamcevoy/2021/08/18/facebook-says-it-has-removed-20-million-pieces-of-covid-misinformation-but-sees-signs-vaccine-hesitancy-is-declining/?sh=167cec97707c
McKinnell, J. (2021). Two defence arguments rejected in John Barilaro’s defamation case against Friendlyjordies. ABC News. Retrieved from https://www.abc.net.au/news/2021-08-13/judge-rejects-arguments-in-friendly-jordies-defamation-case/100374936
Murrell, C. (2019). The Christchurch shooting was streamed live, but think twice about watching it. ABC News. Retrieved from https://www.abc.net.au/news/2019-03-15/christchurch-shooting-live-stream-think-twice-about-watching-it/10907258
Niemiec, E. (2020). COVID-19 and misinformation. EMBO Reports, 21(11). doi: 10.15252/embr.202051420
Parliament of Australia. (2018). Report on the conduct of the 2016 federal election and matters related thereto. Canberra: Parliament of Australia.
Paul, K. (2019). Ocasio-Cortez stumps Zuckerberg with questions on far right and Cambridge Analytica. The Guardian. Retrieved from https://www.theguardian.com/technology/2019/oct/23/mark-zuckerberg-alexandria-ocasio-cortez-facebook-cambridge-analytica
Paul, K. (2021). ‘A systemic failure’: vaccine misinformation remains rampant on Facebook, experts say. The Guardian. Retrieved from https://www.theguardian.com/technology/2021/jul/21/facebook-misinformation-vaccines
Rolph, D. (2021). High Court rules media are liable for Facebook comments on their stories. Here’s what that means for your favourite Facebook pages. The Conversation. Retrieved 15 October 2021, from https://theconversation.com/high-court-rules-media-are-liable-for-facebook-comments-on-their-stories-heres-what-that-means-for-your-favourite-facebook-pages-167435
Taylor, J., & Karp, P. (2021). MP Craig Kelly ‘absolutely outraged’ after Facebook removes his page for misinformation. The Guardian. Retrieved from https://www.theguardian.com/australia-news/2021/apr/26/mp-craig-kelly-absolutely-outraged-after-facebook-removes-his-page-for-misinformation