The Incomplete Source of Content Moderation

Abstract

Content moderation is a sustained undertaking: it makes platforms setters of norms, interpreters of laws, arbiters of taste, adjudicators of disputes, and enforcers of whatever rules they choose to establish (Gillespie, 2018, p. 5). Like any civil system, content moderation requires refinement and improvement in the face of many challenges and evolving practices; this is a necessary trend in the development of the web. Governments, moreover, should play a more substantial role in regulating content moderation, not only on social media but across the broader web. At present, content moderation operates in the absence of a fair and transparent system, which exacerbates existing social problems and, in turn, destabilizes the economic, political, and civic development of a globalized world.

Content screening is a practice that has never been absent from the news media industry, yet the structural frameworks of traditional media no longer fit contemporary media communication. Social platform companies implement and continuously update content moderation to maintain user interest and stave off obsolescence (Roberts, 2019, p. 57).

The Facebook Fiasco of 2018

Facebook CEO Mark Zuckerberg delivers the keynote address at Facebook’s developer conference, May 1, 2018. (AP Photo/Marcio Jose Sanchez)
https://www.cigionline.org/articles/social-medias-self-regulation-isnt-enough/

In July 2018, Facebook endured a history-making share-price plunge as investors grasped the enormous cost of maintaining a safe and responsive online public square. The episode traces back to Mark Zuckerberg’s 2018 New Year’s resolution, posted on Facebook, in which he promised to tackle online hate (Elghawaby, 2018). The fiasco raised concerns on several fronts: from investors; from legislators, who took Facebook as their example when questioning the self-regulation of other social platforms; and above all from social media users worldwide. The security of private information and of the online environment is hotly debated among users, and the concern is well founded: the implementation of content moderation has not eliminated crime on online platforms, and incidents of privacy exposure, personal attacks, and similar abuses remain numerous.

Mark Zuckerberg’s statement thus raises an issue that he cannot fix alone and that remains a thorn in users’ sides. Social platforms are accustomed to acting as neutral intermediaries, positioning themselves as distinct from content contributors and publishers. Yet as intermediaries they handle enormous volumes of user data and capital, making it impossible to ignore how they behave in the online community. In summary, given the current state of the online environment, content moderation implemented by social media alone is unmanageable and cannot improve, because the debates and contradictions surrounding it stem from many sides at once. It is therefore essential that multiple parties, including governments, the public, and regulators, be substantially involved in the practice of content moderation on social media and other online services.

Internal Contradiction of Content Moderation

Illustrations by Corey Brickley, 2019

The implementation of content moderation across digital platforms is burdened with challenges of scale. In short, it relies on two parts: algorithmic technology and human content moderators. The volume of material generated by users converges into an unwieldy, intractable pile, and the technology actually deployed falls well short of what fully automated moderation would require (Roberts, 2019, p. 34). Machine-automated filtering therefore lacks the capacity to undertake the whole task on its own. Human moderators consequently play an essential role in reviewing and regulating user-generated content: they help guarantee compliance with the laws and regulations governing each platform’s operation, and they contribute to maintaining the community of users who voluntarily upload and view content on its sites. To a significant extent, then, the implementation of content moderation is predicated on both algorithms and professional content moderators.
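To make this division of labour concrete, below is a minimal sketch in Python of how such a two-stage pipeline could be organized. It is an illustration only: the blocklist scoring, the 0.5 threshold, and every name in it (Post, ModerationQueue, triage) are hypothetical stand-ins, not a description of any real platform’s system, which would use trained machine-learning classifiers and far more elaborate routing rules.

from dataclasses import dataclass, field
from typing import List

# Hypothetical placeholder terms; a real system would use trained classifiers.
BLOCKLIST = {"badword1", "badword2"}

@dataclass
class Post:
    post_id: int
    text: str

@dataclass
class ModerationQueue:
    removed: List[Post] = field(default_factory=list)       # auto-removed by the machine
    human_review: List[Post] = field(default_factory=list)  # escalated to moderators
    published: List[Post] = field(default_factory=list)     # passed automated screening

def automated_score(post: Post) -> float:
    """Toy stand-in for a model score: fraction of blocklisted words."""
    words = post.text.lower().split()
    return sum(w in BLOCKLIST for w in words) / len(words) if words else 0.0

def triage(post: Post, queue: ModerationQueue) -> None:
    """Route a post: only clear-cut cases are decided automatically."""
    score = automated_score(post)
    if score > 0.5:
        queue.removed.append(post)        # confident violation
    elif score > 0.0:
        queue.human_review.append(post)   # uncertain: a human must decide
    else:
        queue.published.append(post)

queue = ModerationQueue()
for i, text in enumerate(["hello world", "badword1 badword2", "one badword1 here"]):
    triage(Post(i, text), queue)
print(len(queue.published), len(queue.human_review), len(queue.removed))  # 1 1 1

The point of the sketch is the middle branch: everything the machine cannot decide with confidence lands in the human-review queue, which is exactly where the human labour described below comes in.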

Exacerbating Social Class Inequalities

https://www.theverge.com/2019/2/25/18229714/cognizant-facebook-content-moderator-interviews-trauma-working-conditions-arizona

According to an article titled “The secret lives of Facebook moderators in America”, some employees found the work a source of such extreme trepidation that they brought a gun to work every day, while other moderators suffered severe effects on their mental health and developed PTSD-like symptoms (Newton, 2019).

Social media and the internet have formed an invisible, airtight empire of commercial capital, and the contemporary situation of content moderators strikingly reflects the exploitative side of that capital. First, consider moderators’ pay objectively: the annual income of the moderators in Phoenix is $28,800, whereas the average salary of Facebook staff is $240,000 (Newton, 2019). Moreover, a large proportion of online content screening, even when nominally “in-house”, is performed under employment relationships best described as “temporary” or “contract” work: the moderators are not full-time staff of the company, and the company has no obligation to continue their employment beyond the end of the contract (Roberts, 2019, p. 44). In sum, the absence of government intervention to enforce public regulation of social media platforms exacerbates these severe socio-economic fault lines, as the opacity of the rules gradually undermines the basic protections and rights of the employees at the bottom.

This disappointing picture reveals significant inequalities in content moderators’ duties, requirements, and salary packages. In detail, content moderators are usually required to sign nondisclosure agreements covering the details of their work. Platforms justify this as keeping the exact nature of content moderation secret from competitors, but it is also a cunning way to evade regulation and scrutiny of each platform’s moderation policies by users, relevant regulators, and civil society organizations (Roberts, 2019, p. 38). Moreover, staff turnover in content moderation is enormous: employees are not protected by regular employment law and are therefore liable to be dismissed for a minor error (Newton, 2019). Constant staff restructuring further exacerbates the mental strain on employees, especially as content moderators face extreme and disturbing user-generated content daily. The current treatment of content moderators by social media companies is grossly unconscionable and is plainly an exercise in immediate short-term gain.

Considering Long-Term Strategies

In sum, most contemporary social media platforms practice content moderation in a self-regulatory manner, which gives rise both to the differential treatment of content moderators and to the harsh content and working conditions they endure; this is an internal conflict that the platforms themselves enforce. Objectively speaking, implementing such a policy requires the intervention and regulation of multiple agencies, where more than just the government and the platform need to contribute as independent entities. It should also include cooperative regulation and monitoring by bodies such as non-governmental organizations (NGOs) and civil society. The coordinated participation of multiple parties allows content moderation to be implemented under the supervision and control of several stakeholders, increasing the transparency and fairness of the process.

Total words: 1210

Author: Keer Tan

Tut group: 06 (Ayesha)

Reference page

Elghawaby, A. (2018, August 2). Social media’s self-regulation isn’t enough. Retrieved October 15, 2021, from https://www.cigionline.org/articles/social-medias-self-regulation-isnt-enough/

Gillespie, T. (2018). All platforms moderate. In Custodians of the internet: Platforms, content moderation, and the hidden decisions that shape social media (pp. 1-13). New Haven, CT: Yale University Press.

King, F. (2019). What is Techlash and what does it mean for the digital industry? Retrieved October 15, 2021, from https://www.balticapprenticeships.com/blog/what-is-techlash-and-what-does-it-mean-for-the-digital-industry

Newton, C. (2019, February 25). The Trauma Floor: The secret lives of Facebook moderators in America. Retrieved October 15, 2021, from https://www.theverge.com/2019/2/25/18229714/cognizant-facebook-content-moderator-interviews-trauma-working-conditions-arizona

Newton, C. (2019, June 19). Bodies in seats. Retrieved October 15, 2021, from https://www.theverge.com/2019/6/19/18681845/facebook-moderator-interviews-video-trauma-ptsd-cognizant-tampa

Roberts, S. T. (2019). Understanding commercial content moderation. In Behind the screen: Content moderation in the shadows of social media (pp. 33-72). New Haven, CT: Yale University Press.
