In the Internet Age: Content Moderation is a Serious Business

"Social Media Icons Color Splash Montage - Landscape" by Blogtrepreneur is licensed under CC BY 2.0

As the number of users and the volume of content continue to expand, the problems associated with content moderation will become more severe, and policy regulations will grow stricter. The proliferation of information makes content control increasingly complex. Free speech and the psychological trauma caused by manual review are two critical challenges associated with content moderation. Whether the government should play a more pivotal role in moderating social media content therefore deserves more attention and discussion.

Challenge 1: Free Speech

“free speech ฟ้องปิดปาก ปิดปาก SLAAP” by Prachatai is licensed under CC BY 2.0

The existence of content moderation has caused widespread controversy and is considered one of the most pressing challenges to freedom of expression (Gillespie, 2018). Some argue that content moderation infringes on the right to free speech. Undeniably, the Internet's free speech model lets users express themselves anonymously, which helps promote an effective and more positive exchange of information within online communities. Anonymity also serves to protect privacy. By contrast, traditional media that restrict speech compel citizens to reveal their true identities, and as a result some citizens choose to forgo exercising free speech online (Gillespie, 2018).

“Rebecca MacKinnon” by New America is licensed under CC BY 2.0

Besides, Rebecca MacKinnon (2014) notes that real-name policies and strict censorship of online content can impose an unavoidable burden on transgender individuals and political dissidents. The anonymity of Internet platforms lets some users shed their identities and vent: by hiding who they are online, they can set aside legal and moral responsibilities and engage more freely in self-expression in online communities (Samples, 2019).

However, whether this anonymous free speech infringes on other people's legal rights and mental health deserves more attention. Digital platforms have specific adverse effects, owing to the uneven conduct of Internet users and the ever-expanding exercise of free expression (Samples, 2019). Victims may suffer cyber-bullying and even lose their lives. South Korea's "Nth Room" case on Telegram is a notable example: the culprit set up numerous channels on the encrypted instant messaging app and publicly posted sexual material extorted from threatened women on each of them, even broadcasting it in public live streams. Although the Internet is a platform where people are free to express themselves, its anonymity allows users with illegal intentions to exploit it, damaging the healthy atmosphere of the Internet. Companies therefore have a responsibility to adopt a real-name system on their platforms to protect the legal rights of each user (Gillespie, 2018). Free speech should be confined to legal content, and content regulation should be implemented reasonably.

Challenge 2: Psychological Trauma Caused by Manual Review

"Low cost 3D scanning workshop with Jan Boehm and Mona Hess" by Institute of Making is licensed under CC BY 2.0

The unfiltered diversity of user-generated content also poses a significant threat to viewers' entertainment consumption. Viewers are frequently exposed to a wide range of material, from mildly abusive and insulting posts to hate speech and extremely violent, unsettling content that endangers their mental health and well-being. In this situation, user-generated content must be scrutinized to determine whether it promotes healthy consumption or distributes harmful material (Gillespie, 2018).

Social media and conventional media can now be screened for potentially dangerous content by a growing number of human content reviewers as well as the newest technological advances. Nevertheless, content auditing takes a severe toll on content moderators' emotional well-being. They are regularly exposed to disturbing, upsetting, and sexually explicit material, increasing their risk of developing post-traumatic stress disorder (PTSD) or suffering mental exhaustion. Distressing material crosses their desks constantly, and moderators are not equipped to cope sensibly with the resulting stress and trauma. A Microsoft spokesperson downplayed the content review job as a "nasty job" (Roberts, 2019, p. 39). Chloe, a full-time Facebook content moderator, was overwhelmed when she reviewed a video depicting a man being murdered, despite spending long hours training to fend off the onslaught of hate speech, violent attacks, and other related content (Newton, 2019). It is the responsibility of companies to offer enough support for content moderators to deal with the psychological effects of their work (Dwoskin et al., 2019). Therefore, the psychological trauma triggered by manual content review is an enormous challenge.

The Government's Role in Content Moderation

Governments should play a more significant role in moderating content on media platforms for several reasons. First, content moderation is a sustainable strategy in which the government acts as rule maker, law interpreter, arbiter, adjudicator of disputes, and enforcer of whatever regulations it decides to enact (Gillespie, 2018, p. 5). The Internet is not a place outside the law. In response to illegal content such as obscenity, pornography, violence, and gore online, the government should draw on the power of industry organizations, social organizations, and the public in Internet governance and build a flexible model of collaborative governance among multiple stakeholders. While remaining wary of government intrusion into citizens' lives, Internet governance also needs to give greater weight to industry self-regulation (Roberts, 2019).

Second, given users' mobility and anonymity, tracking individual users and reviewing content manually is costly for both governments and platforms. Stricter regulations and laws require media platforms to take on more responsibility and to set a bottom line for content. Governments should make platform regulation processes more transparent and open, and the relevant departments and institutions should take on obligations to strengthen their legal and cultural roles (Heldt & Dreyer, 2021).

The government's principal purpose is to create an equal and inclusive value-sharing environment for online users and to transmit sound values. In practice, however, it is difficult and costly for governments to trace content providers and hold them responsible for what they say and do on media platforms (Samples, 2019). Although explicit laws and regulations governing media platforms are sparse, platforms have a responsibility to act as intermediaries and organize public discourse. Moreover, some political topics and matters of social order should be regulated, particularly personal attacks and hate speech against specific groups posted by terrorists. Facebook founder Mark Zuckerberg has stated that Facebook will cooperate with governments and departments to regulate content on sensitive and terrorism-related topics (Allan, 2018). As the number of online users increases, governments have undertaken various measures to curb radicalized speech on media platforms. Furthermore, media platforms will comply with foreign governments' requests to remove sensitive and offensive content (Heldt & Dreyer, 2021). Providing a sound online environment is more likely to create value for the government and society, and combating illegal and criminal behavior is an even higher priority.

In the media industry, content moderation is a serious business that has always existed. Today, however, conventional media's institutional structure no longer applies to contemporary media communications. Amid this sweeping change, the question left to governments is how to keep platforms from slipping to the "dark side", or even to make them brighter, creating a healthy online environment; this may be why governments everywhere invest so much money and effort in content moderation. The government should focus on finding a balance between freedom of expression and content control and on striking a reasonable scale. Moreover, to address the psychological trauma of manual moderation, governments should focus on industry self-regulation and harness the public's power for content control, so that they can engage with moderation more actively and address the problems it creates.

References

Heldt, A., & Dreyer, S. (2021). Competent third parties and content moderation on platforms: Potentials of independent decision-making bodies from a governance structure perspective. Journal of Information Policy, 11, 266–300. https://doi.org/10.5325/jinfopoli.11.2021.0266

Dwoskin, E., Whalen, J., & Cabato, R. (2019, July 25). In the Philippines, content moderators at YouTube, Facebook and Twitter see the worst of the web and suffer silently. The Washington Post. https://www.washingtonpost.com/technology/2019/07/25/social-media-companies-are-outsourcing-their-dirty-work-philippines-generation-workers-is-paying-price/

Gillespie, T. (2018). All platforms moderate. In Custodians of the internet: Platforms, content moderation, and the hidden decisions that shape social media (pp. 1–13). New Haven, CT: Yale University Press.

De Gregorio, G. (2019, November 26). Free speech in the age of online content moderation. Völkerrechtsblog. https://doi.org/10.17176/20191126-121907-0

Samples, J. (2019). Why the government should not regulate content moderation of social media (Policy Analysis No. 865). Cato Institute.

Massanari, A. (2016). #Gamergate and The Fappening: How Reddit's algorithm, governance, and culture support toxic technocultures. New Media & Society, 19(3), 329–346. https://doi.org/10.1177/1461444815608807

Newton, C. (2019, February 25). The trauma floor: The secret lives of Facebook moderators in America. The Verge. Retrieved October 15, 2021, from https://www.theverge.com/2019/2/25/18229714/cognizant-facebook-content-moderator-interviews-trauma-working-conditions-arizona

Allan, R. (2018). Hard questions: Where do we draw the line on free expression? Facebook Newsroom.

Roberts, S. T. (2019). Understanding commercial content moderation. In Behind the screen: Content moderation in the shadows of social media (pp. 33–72). New Haven, CT: Yale University Press.

Content moderation by Wantong Zhang is licensed under a Creative Commons Attribution 4.0 International License.

About Wantong Zhang
This is Wantong Zhang, a second-year student at the University of Sydney, majoring in Digital Cultures as well as Industrial Relations and Human Resources Management. In my spare time, I love traveling and taking photos.