The Significant Challenges of Content Moderation

Content moderation is essential to digital platforms. Without it, these digital spaces would become forums rife with hate speech, violent and confronting content, and inappropriate material, and would consequently be very different environments. It is undeniable that content moderation is essential for digital platforms to succeed economically and to gain currency and recognition on a broader scale. With this significance in mind, the debates that have come to the fore about content moderation have centred on the extent to which it should be exercised and the specific processes of its execution. The issues surrounding these debates include the challenge of unifying norms and values, the embedded features of bias, and the inconsistencies within current content moderation processes. In response to such issues, government-based approaches have been proposed as a way to ease these tensions; however, this solution involves a complicated conversation of its own and presents its own shortcomings, including further bias.

Considerations within Content Moderation

The rise of digital platforms has in large part characterised the past two decades. From developments in search engines to the polarising growth of social media, the presence of digital platforms has become highly prevalent. As they have become global phenomena so seamlessly intertwined with everyday life, so too have the criticisms surrounding their policies and regulation grown. Content moderation is therefore not a question of if, but how. As Gillespie (2018) expressed:

Platforms must, in some form or another, moderate: both to protect one user from another, or one group from its antagonists, and to remove the offensive, vile, or illegal—as well as to present their best face to new users, to their advertisers and partners, and to the public at large. (p. 5)

The implementation of content moderation, however, has proved challenging and has drawn attention to various issues. Digital platforms face many expectations, ranging from settling disputes, to unifying norms, to interpreting laws, all while enforcing their own individual rules (Gillespie, 2018, p. 5). The challenges surrounding content moderation lie not only in deciding what is appropriate, but also in defining the repercussions and forms of punishment when rules are broken (Gillespie, 2018, p. 6). The list of considerations and questions surrounding the moderation and regulation of digital platforms thus becomes extensive, and the answers to such questions are anything but clear.

Implications of Bias

Forms of content moderation such as the flagging and removal of inappropriate or ill-suited content are particularly visible on social media platforms and involve both algorithms and employed moderators (Common, 2020). There remains, of course, a point at which the enforcement of these processes opens the possibility for bias and inconsistency (Common, 2020, p. 132). Whether coming directly from an individual or from an algorithm, bias is often left unmentioned despite being a telling feature of content moderation. The values and norms of digital platforms are reflected through their policies and methods of regulation, but so too are any prejudices, opinions, and even politics. Bias is therefore detectable both directly through human moderation and indirectly through the algorithms created by individuals (Common, 2020, p. 132). Processes of moderation thus depend on the pre-existing values, norms, and opinions of the individuals behind a platform's policies, who determine its standards, exceptions, and overall concerns, leaving ample room for bias.

[Figure 1]
One example of bias within processes of content moderation relates to the topic of blood. As Common (2020, p. 133) highlights, there have been instances where an image depicting a woman's menstrual blood was found unacceptable, while an image of a man lying in a pool of his own blood was acceptable. This contradiction over which depictions of blood are deemed appropriate online signifies how biases can at times be reflected through content moderation. Half the population is accustomed to menstrual blood in their lives, and it does nothing to represent violence as the latter image does. This example also suggests the importance of considerate moderation processes that do not fail to be inclusive, inhibit a group of users, or reinforce norms that frame a woman's menstrual blood as offensive or repugnant.

Commercial Content Moderation

Commercial content moderation is a term that refers to the screening of user-generated content posted to internet sites, social media, and other online spaces (Roberts, 2019, p. 33). It is unique in the way that it is crucial to maintaining and protecting platform brands, which it does by requiring users to conform to site guidelines and rules (Roberts, 2019, p. 34). A distinct issue with this is that much of the decision making about the appropriateness of content is made by workers underprepared for the job. Commercial content moderation calls on workers to perform extensive hours of repetitive, stationary work, and the workers employed are often removed from the region in which the content originated (Roberts, 2019, p. 50). A moderator requires certain qualities: linguistic competency in the language of the content, an understanding of the relevant laws governing the site's country of origin, and knowledge of the platform's specific guidelines (Roberts, 2019, p. 35). While commercial content moderation calls on a large quantity of workers, the quality of moderation becomes increasingly inconsistent.

[Figure 2]

The Challenge of Unification

A well-known example that further illustrates the difficulties of content moderation is the mixed response to Nick Ut's photograph, The Terror of War (Gillespie, 2018).

[Figure 3]
After Facebook deleted an article that included the photo, and faced extensive criticism for doing so, the company voiced the difficulty of its decision. Its Vice President at the time, Justin Osofsky, explained that “in many cases, there’s no clear line between an image of nudity or violence that carries global and historic significance and one that doesn’t” (Gillespie, 2018, p. 3). He went on to explain that what may be considered acceptable and relevant for some may not be for others, and that it is challenging to find a balance between ‘enabling expression’ and ‘protecting our community’ (Gillespie, 2018, p. 3). Here lies an example of where tensions between values and norms can arise. The reach of social platforms such as Facebook is globally extensive, and when content such as this image is shared, it can act as a catalyst for divisive debates over suitability and significance, drawing attention to the challenge of unifying largely diverse norms.

A Government-based Approach?

In light of issues such as those mentioned above, increasingly government-based approaches have been suggested; however, it is likely that issues with content moderation would remain. At the beginning of 2020, hearings involving the CEOs of both Google and Facebook demonstrated that 'technological literacy for the top lawmakers in the country is low' (Mikaelyan, 2021, p. 208). Mikaelyan (2021, p. 208) also noted the large stake that politicians hold in content moderation: with media being such a useful forum for connecting with the public, government involvement presents the opportunity for 'oppressive legislation that stifles speech'. Furthermore, interactions on digital platforms often occur on an international scale, and for a government or nation state to be the primary body responsible for moderation may lead to overbearing control within a space traditionally known for its freedom. As Flew, Martin, and Suzor (2019, p. 46) outline, a major obstacle lies in unifying diverse nations' standards for what is and is not acceptable within online spaces. Navigating these conflicting legal systems means an increasingly government-based approach would pose new challenges of its own.

[Figure 4]

Significant challenges and issues arise in relation to the content moderation of digital platforms. Undoubtedly, the extensive scale and global reach of such platforms only raise further questions about how to moderate so that all individuals can interact equally and safely online. Current processes of content moderation highlight the need for more considerate and well-focused methods, yet with such a large intersection of people from across the world, there is no clear path to the unification of values and norms, nor is there a clear solution in government-based approaches.

Reference List

Common, M. (2020). Fear the reaper: how content moderation rules are enforced on social media. International Review of Law, Computers & Technology, 34(2), 126-145. doi: 10.1080/13600869.2020.1733762

Flew, T., Martin, F., & Suzor, N. (2019). Internet regulation as media policy: rethinking the question of digital communication platform governance. Journal of Digital Media & Policy, 10(1), 33-46. doi: 10.1386/jdmp.10.1.33_1

Gillespie, T. (2018). All platforms moderate. In Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions That Shape Social Media (pp. 1-23). Yale University Press. doi: 10.12987/9780300235029

Mikaelyan, Y. (2021). Reimagining content moderation: Section 230 and the path to industry-government cooperation. Loyola of Los Angeles Entertainment Law Review, 41(2), 179-214.

Roberts, S. (2019). Understanding commercial content moderation. In Behind the Screen: Content Moderation in the Shadows of Social Media (pp. 33-51). Yale University Press. doi: 10.12987/9780300245318