Content moderation is an inherent characteristic of digital platforms, and the platformization of the Internet has made content moderation increasingly important. Gillespie (2018, p. 5) explains that, “whether they want to or not, platforms find that they must serve as setters of norms, interpreters of laws, arbiters of taste, adjudicators of disputes, and enforcers of whatever rules they choose to establish”. Yet if platforms play such important roles in Internet governance and content moderation, are they doing a good job? This article analyzes three aspects of the topic: the issues that content moderation raises for digital platforms, the controversies it generates, and the role of government organizations in content moderation.
What issues arise for digital platforms with content moderation?
In simple terms, content moderation is the review of content that users upload, post, or share on platforms, including text, images, audio, and video. However, content moderation is not an easy task: the challenge for platforms is exactly when, how, and why to intervene (Gillespie, 2018, p. 5), even as they try to avoid the appearance of moderating at all.
When
Figure 2. “time” by Sean MacEntee is licensed with CC BY 2.0. To view a copy of this license, visit https://creativecommons.org/licenses/by/2.0/
A platform must first decide when to conduct content moderation: either moderators review content after users have published it (post-moderation), or users submit content for review and it appears on the platform only once that review is approved (pre-moderation).
The former method suits digital platforms that require high timeliness of published content, such as user comments on information or music platforms: users publish comments largely on impulse, so they expect their comments to appear immediately. The latter is more applicable to users posting articles, opinions, or discussions in communities, forums, or news platforms. Platforms such as TikTok and YouTube need to be responsible for their content and build authority, so they usually review content before publication. The sketch below makes the difference concrete.
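Here is a minimal Python sketch of the two timing strategies; the `Post` class, the `review()` check, and the queue handling are hypothetical stand-ins under simplified assumptions, not any platform’s actual pipeline.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    visible: bool = False  # whether other users can currently see the post

def review(post: Post) -> bool:
    """Stand-in moderation check; real platforms combine many signals."""
    restricted = {"violence", "gore"}  # placeholder restricted terms
    return not any(term in post.text.lower() for term in restricted)

def publish_pre_moderated(post: Post, moderation_queue: list) -> None:
    # Pre-moderation: the post stays hidden until review approves it.
    if review(post):
        post.visible = True
    else:
        moderation_queue.append(post)  # held for a human decision

def publish_post_moderated(post: Post, moderation_queue: list) -> None:
    # Post-moderation: the post is visible immediately, reviewed afterwards.
    post.visible = True
    if not review(post):
        post.visible = False           # taken down after the fact
        moderation_queue.append(post)  # escalated to a human moderator
```

The trade-off is visible in the code: pre-moderation protects the platform at the cost of delay, while post-moderation gives users the immediacy they expect at the risk of harmful content being briefly visible.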
How
Figure 3. “working in free space” by iklash/ is licensed with CC BY-NC 2.0. To view a copy of this license, visit https://creativecommons.org/licenses/by-nc/2.0/
Most platforms start with a machine moderation process, in which software automatically detects content containing restricted or sensitive words; its advantage is that it can process information in batches and speed up review. As Roberts (2019, p. 36) states, however, the vast majority of social media content uploaded by users requires human intervention for it to be appropriately screened, particularly where video or images are involved. Content review is therefore difficult to fully automate: some content, such as material that attracts user complaints, must be flagged for re-review by human content moderators. A toy version of this hybrid pipeline is sketched below.
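As a rough illustration of that hybrid pipeline, the following Python sketch screens text in bulk against a keyword list and escalates complained-about items to a human review queue. The term list, the complaint threshold, and all names are illustrative assumptions; production systems rely on far more sophisticated classifiers.

```python
# Toy hybrid moderation pipeline: automatic keyword screening in bulk,
# plus complaint-driven escalation to human moderators. The vocabulary
# and threshold below are illustrative assumptions, not a real system.

SENSITIVE_TERMS = {"gore", "hate"}  # placeholder restricted vocabulary
COMPLAINT_THRESHOLD = 3             # complaints before human re-review

def machine_screen(texts: list[str]) -> list[bool]:
    """Batch pass: True means the text cleared the automatic filter."""
    return [not any(term in text.lower() for term in SENSITIVE_TERMS)
            for text in texts]

def escalate(content_id: str, complaints: int, human_queue: list[str]) -> None:
    """Complaint pass: enough user reports send content to human moderators."""
    if complaints >= COMPLAINT_THRESHOLD:
        human_queue.append(content_id)

queue: list[str] = []
print(machine_screen(["hello world", "graphic gore clip"]))  # [True, False]
escalate("video-123", complaints=5, human_queue=queue)
print(queue)  # ['video-123']
```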
It is worth noting that content moderation is a profession especially prone to panic attacks and other psychological problems. Most Internet companies have rules that prohibit moderators from divulging the content of their work; in other words, moderators must digest the disturbing material alone. For most of them, viewing horrific content such as violence and gore is routine, and those uncomfortable images often appear without warning, delivering a huge visual and emotional impact. After prolonged exposure to conspiracy theories and similar material, many moderators become numb and lose their ability to discriminate.
In addition, these tasks are demanded of workers who are frequently relatively low-status and low-wage in relation to others in the tech industry (Roberts, 2019, p. 39). Many moderators work as outsourced contract workers: they are not paid at a level commensurate with their high-pressure work, and they often do not enjoy the salary and benefits security of regular employees.
The controversies
Content moderation is an essential part of information-based or content-based platforms such as Facebook, YouTube, Twitter, and Google. Its main purpose is to remove restricted or illegal content, in order to enhance the user experience and avoid copyright infringement or public controversy. However, attempts to implement content moderation have generated inevitable, long-standing controversy. The debate usually concerns freedom of speech and expression: censoring content in some way limits or sacrifices part of that freedom.
Freedom of speech describes a state in which coercion over one’s speech is minimized as far as possible in society. The Declaration of the Rights of Man and of the Citizen (France, 2012) clearly states that the free communication of ideas and opinions is one of the most precious of the rights of man. However, where the boundaries of freedom of expression lie is a long-standing debate. Without content moderators, hateful, violent, gory, and pornographic content would perhaps flood the platforms; platforms that lack content moderation provide fertile ground for these kinds of toxic spaces to emerge (Massanari, 2016, p. 331). Whether a balance can be found between content moderation and freedom of expression is therefore an important question, and one that has persisted since the first attempts at moderation.
The role of government organizations in content moderation
Governments have given Internet companies and digital platforms considerable space for self-regulation. As Mark Zuckerberg said, “the real question, as the Internet becomes more important in people’s lives, is what is the right regulation, not whether there should be or not” (Watson, 2018). As discussed above, completely autonomous governance by platforms raises many problems and controversies, but a completely government-led regulatory model is not the right solution either.
As Google’s Eric Schmidt put it, “the government runs three times slower than normal business … so what you want to do is you want to make sure that the government does not get in the way and slow things down” (Cunningham, 2011). The main argument of those who oppose government regulation is that the government lacks the ability to regulate technology in the public interest. They also argue that government regulation of the Internet tends to be too strict, which would limit the diversity of the Internet’s development. This echoes a core concept of the early Internet: freedom, and the need for the network to develop outside government regulation.
The multi-stakeholder governance model proposed by the Internet Society (2020) is a model of Internet governance in which government organizations play little part in content review: it allocates power across different organizations so that non-government organizations hold the same or even greater rights than government organizations. In comparison, the concept of the Four Internets (O’Hara & Hall, 2018) may be more applicable to current Internet governance, distinguishing the Silicon Valley open Internet, the bourgeois Internet, the authoritarian Internet, and the commercial Internet. Allowing different countries or regions to choose the governance model that fits their respective political philosophies or social ideologies may be the best option at present.
Conclusion
Content moderation is essential for digital platforms. Current attempts at content moderation show that the biggest problems digital platforms face are how to protect the psychological health of content moderators and how to balance free speech with content moderation. An Internet governance model that completely lacks government involvement in content moderation is flawed, and governments should play a role in content moderation on social media. The best approach at present may be for each country or region to make its own choices based on its own circumstances.
References
Cunningham, L. (2011, October 1). Google’s Eric Schmidt expounds on his Senate testimony. The Washington Post. Retrieved October 14, 2021, from https://www.washingtonpost.com/national/on-leadership/googles-eric-schmidt-expounds-on-his-senate-testimony/2011/09/30/gIQAPyVgCL_story.html.
France. (2012). Declaration of the rights of man and of the citizen. A New Dictionary of the French Revolution. https://doi.org/10.5040/9780755622771.ch-0121
Gillespie, T. (2018). All platforms moderate. In Custodians of the internet: Platforms, content moderation, and the hidden decisions that shape social media (pp. 1–23). Yale University Press.
“INTERNET” by lecasio is licensed with CC BY-NC-ND 2.0. To view a copy of this license, visit https://creativecommons.org/licenses/by-nc-nd/2.0/
Massanari, A. (2016). #Gamergate and The Fappening: How Reddit’s algorithm, governance, and culture support toxic technocultures. New Media & Society, 19(3), 329–346. https://doi.org/10.1177/1461444815608807
O’Hara, K., & Hall, W. (2018, December 7). Four internets: The geopolitics of digital governance. Centre for International Governance Innovation. Retrieved October 14, 2021, from https://www.cigionline.org/publications/four-internets-geopolitics-digital-governance/.
Roberts, S. T. (2019). Understanding commercial content moderation. In Behind the screen: Content moderation in the shadows of social media (pp. 33–72). Yale University Press.
“time” by Sean MacEntee is licensed with CC BY 2.0. To view a copy of this license, visit https://creativecommons.org/licenses/by/2.0/
Watson, C. (2018, April 11). The key moments from Mark Zuckerberg’s testimony to Congress. The Guardian. Retrieved October 14, 2021, from https://www.theguardian.com/technology/2018/apr/11/mark-zuckerbergs-testimony-to-congress-the-key-moments.
Internet Society. (2020, September 21). Who makes the Internet work: The Internet ecosystem. Retrieved October 14, 2021, from https://www.internetsociety.org/internet/who-makes-it-work/.
“working in free space” by iklash/ is licensed with CC BY-NC 2.0. To view a copy of this license, visit https://creativecommons.org/licenses/by-nc/2.0/