Content Moderation on Social Media Platforms

Platforms bear a great responsibility for social and public ideology, which spans the state, culture, and the political economy. As an important medium connecting users, social media enhances the flow and exchange of information. Although digital platforms do not themselves produce content, they exert a significant influence on society and public ideology. While governments rarely intervene directly, they should call on platforms to monitor themselves. Social media should take the initiative to act responsibly and to regulate in line with public values.

“Automotive Social Media Marketing” by socialautomotive is licensed under CC BY 2.0

What Is a Platform?

The term ‘digital platform’ can be defined broadly: platforms are often complex information-technology and commercial organisations that provide a diversity of online services. This article focuses primarily on social media platforms, which act as intermediaries allowing users to post, create, and republish information. Social media also functions as an interactive community in which users, platforms, and corporate rules and policies interconnect and influence one another.


The History and Development of Platform Content Moderation

As social media has developed, platforms have realised the importance of moderating user content. The earliest social media sites saw themselves as promoting public expression and avoided interfering with content (Boyd, 2011), and in those early days the news media often labelled the few platforms that did review content as ‘hypocritical’ (Gillespie, 2018, p. 8). In the Web 2.0 era, audiences no longer merely receive information; they edit and produce content themselves. Unlike in the early days of the internet, platforms now set rules governing user behaviour in response to the vast amount of information disseminated and their active user bases. Platforms are constantly and rapidly evolving, and they exist in various forms as multicultural communities. Their business rests mainly on the flow of information: gaining traffic and popularity, and building commercial services, including advertising, on top of a large user base. Amid exponential growth in users, and unlike in the early internet, platforms face the challenge of constructing new forms of governance over content (Kerr & Kelleher, 2015).

Platform Business Form and Regulation

Social media platforms are economic institutions that profit from large numbers of users, advertisers, and partner companies (Gillespie, 2018, p. 11). Platforms host many other media, such as advertisements or links to other platforms, and by positioning themselves as third parties they avoid many responsibilities. Often, several platforms are linked together to form a business model. With the rapid growth of social media in recent years, competition between companies is intense. Fundamentally, all social media are commercial, for-profit ventures, which drives them to work with various advertisers and media companies.

However, platforms face a double-edged sword: they must strike a balance between over-regulation, moderation, and freedom of expression. Many platforms are founded for entrepreneurial purposes, and they usually want more users and a livelier community. Their profitability lies in attracting users: platforms collect data on users, build profiles of their interests and preferences, and push ads to these ‘target groups’ (Martin, 2017). As they attract more users, platforms become service providers and cooperate with advertisers, forming an essential online media structure from which profits can be made. From an economic point of view, increased regulation risks losing users, or reducing the time they spend on the platform, by curtailing freedom of expression, and with it the opportunity to work with advertisers (Kyle, 2016). As the user base expands and websites become diverse communities, monitoring and addressing the issues that arise is a significant challenge.
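To make the targeting mechanism concrete, here is a minimal sketch assuming a toy data model of per-user interest scores. The UserProfile and Ad classes, the tags, and the scoring rule are illustrative inventions, not any platform’s actual system:

```python
from dataclasses import dataclass, field

@dataclass
class UserProfile:
    """Interests inferred from a user's activity (hypothetical data model)."""
    user_id: str
    interests: dict[str, float] = field(default_factory=dict)  # tag -> affinity in [0, 1]

@dataclass
class Ad:
    """An ad campaign and the interest tags it targets (hypothetical)."""
    ad_id: str
    target_tags: set[str]

def score_ad(user: UserProfile, ad: Ad) -> float:
    """Sum the user's affinity for each tag the ad targets."""
    return sum(user.interests.get(tag, 0.0) for tag in ad.target_tags)

def pick_ads(user: UserProfile, ads: list[Ad], k: int = 3) -> list[Ad]:
    """Return the k ads whose targeting best matches the user's profile."""
    return sorted(ads, key=lambda ad: score_ad(user, ad), reverse=True)[:k]

user = UserProfile("u1", {"cars": 0.9, "travel": 0.4})
ads = [Ad("a1", {"cars"}), Ad("a2", {"cooking"}), Ad("a3", {"travel", "cars"})]
print([ad.ad_id for ad in pick_ads(user, ads, k=2)])  # ['a3', 'a1']
```

The commercial incentive described above falls out of this loop: the richer the interest profile, the better the match, which is why platforms are motivated to collect ever more user data.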


Society, Culture, Ideology, and Values

Some people believe that platforms should adopt a hands-off model in which everyone retains creativity, the right to speak freely, and the ability to reproduce, share, and edit information without barriers. Nevertheless, as the number of users grows, more and more voices appear on social media. The values and cultures of diverse international users differ widely, and many platforms spend a great deal of time and staffing on content review.

Globally, platform regulation is complicated by a thorny question: whether users are willing to accept restrictions while enjoying freedom of expression. Platform management is often a complex and challenging process. In response to users who are unwilling to abide by its rules, Facebook strictly regulates how posts are distributed and reserves the right to suspend users who violate its community rules (Tobin, Varner & Angwin, 2017).
     “Algorithm” by Pxhere is licensed under CC BY 2.0
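As a purely illustrative sketch of such enforcement (the three-strike threshold and record-keeping below are invented for this example, not Facebook’s actual system), suspension can be modelled as counting confirmed violations per account and acting once a threshold is crossed:

```python
from collections import defaultdict

# Hypothetical policy: suspend after this many confirmed violations.
SUSPENSION_THRESHOLD = 3

violations: dict[str, int] = defaultdict(int)

def record_violation(user_id: str) -> str:
    """Record one confirmed rule violation and return the account's new status."""
    violations[user_id] += 1
    if violations[user_id] >= SUSPENSION_THRESHOLD:
        return "suspended"
    return "warned"

for _ in range(3):
    status = record_violation("user-42")
print(status)  # suspended
```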


Adrienne (2015) suggests that platforms must face certain socio-cultural and public-opinion responsibilities, undertaking the censorship and logistical work needed to control user behaviour and content so that platform content remains legal and ethical. Platforms should improve their intervention systems, taking into account the impact that undesirable content has on society. Pornography, violence, hate speech, self-harm, and propaganda for war and terrorism have a severely negative impact on public and societal ideology. To protect every user’s experience, platforms have a responsibility to limit such content. Many vulnerable people are offended and harmed on digital platforms, leading them to expect platforms to intervene (Kayyali & O’Brien, 2015). While algorithms that automatically block keywords can effectively protect users, platform regulation still relies on extensive human labour. Platforms often maintain internal policy teams and oversight audits that work silently but under tremendous pressure (Chen, 2014).
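As a minimal sketch of this kind of automatic filter (the blocklist, function names, and review workflow below are hypothetical, not drawn from any real platform), keyword blocking amounts to matching each post against a maintained list of banned terms. The example also shows why such filters miss simple evasions, which is one reason the human reviewers described above remain necessary:

```python
import re

# Hypothetical blocklist; real systems maintain far larger, localised lists.
BLOCKED_KEYWORDS = {"slur1", "slur2", "threat-phrase"}

def contains_blocked_keyword(post: str) -> bool:
    """Return True if the post contains any blocked keyword as a whole word."""
    words = re.findall(r"[a-z0-9'-]+", post.lower())
    return any(word in BLOCKED_KEYWORDS for word in words)

def moderate(post: str) -> str:
    """Auto-hide posts that trip the keyword filter; publish the rest."""
    if contains_blocked_keyword(post):
        return "hidden: pending human review"
    return "published"

print(moderate("an ordinary holiday photo caption"))  # published
print(moderate("a post containing slur1"))            # hidden: pending human review
print(moderate("a post containing s1ur1"))            # published -- a trivial misspelling evades the filter
```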

Scrubbing the net: The content moderators – The Listening Post. Source: YouTube – https://www.youtube.com/watch?v=IpqKRaKiG-w

Government Intervention in Moderating Social Media Content

Whether governments should increase their intervention and regulate platforms’ review of social media content is a highly debated topic. Governments face public pressure to give content regulation legal force, which turns on how effectively platforms and their industry ‘self-regulate’. While governments are unlikely to intervene directly in social media, they require platforms to ‘self-regulate’ out of consideration for vulnerable populations and for the culture and ideology of society as a whole (Gillespie, 2017, p. 24). Although users’ anonymity and mobility make it costly to track down individuals, governments demand that platforms take greater responsibility for setting guidelines and intervening in content. Governments should increase the openness and transparency of the processes required to regulate platforms, and they have an obligation to play a constructive legal and cultural role (Amélie & Stephan, 2021).

“Facebook Press Conference” by Robert Scoble is licensed under CC BY 2.0


Governments should regulate platforms and web pages to shape an equal and inclusive environment of shared values. However, it is practically impossible and financially unaffordable for governments to hold providers actively responsible for everything users say or do (John, 2019). Despite the lack of explicit legal constraints on social media, platforms are responsible for acting as intermediaries and for organising public discourse. When it comes to control, governments should address the impact of politics on public sentiment and social order, especially physical attacks on society driven by terrorism or hate speech. Mark Zuckerberg has highlighted that Facebook works closely with governments on politically sensitive and terrorism-related topics (Richard, 2018). With the rapid rise in users, governments have developed different initiatives to combat radicalised speech on platforms, and platforms often respond to foreign governments’ requests to remove offensive content (FEC, 2003).

Conclusion

There are growing calls for platforms to take on greater responsibility, managed not merely under compulsion of law but voluntarily, out of a sense of public duty. With the rapid growth of the internet, platforms should strengthen their management of socially, culturally, and ideologically harmful radical content. While not the publishers of content, platforms play a decisive role in constituting public discourse. Governments should call on platforms to regulate themselves actively, and social media should take responsibility for fairness and for social and cultural values by taking the initiative to monitor and regulate.

References:

Adrienne, L. M. (2015). Participatory Culture, Community, and Play: Learning from Reddit. New York: Peter Lang.

Amélie, H., & Stephan, D. (2021). Competent Third Parties and Content Moderation on Platforms: Potentials of Independent Decision-Making Bodies From A Governance Structure Perspective. Journal of Information Policy, 11, 266–300. Retrieved from https://doi.org/10.5325/jinfopoli.11.2021.0266

Boyd, D. (2011). Social network sites as networked publics: Affordances, dynamics, and implications. In Z. Papacharissi (Ed.), A Networked Self: Identity, Community, and Culture on Social Network Sites (pp. 39-58). New York: Routledge.

Chen, A. (2014). The Laborers Who Keep Dick Pics and Beheadings Out of Your Facebook Feed. Wired. Retrieved from http://www.wired.com/2014/10/content-moderation/

Federal Election Commission. (2003). Foreign Nationals Brochure.

Gillespie, T. (2018). Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions That Shape Social Media. Yale University Press.

Gillespie, T. (2017). Governance by and through platforms. In J. Burgess, A. Marwick & T. Poell (Eds.), The SAGE Handbook of Social Media (pp. 254-278). London: SAGE.

John, S. (2019). Why the Government Should Not Regulate Content Moderation of Social Media. Cato Institute Policy Analysis.

Kayyali, N., & O’Brien, D. (2015). Facing the challenge of online harassment. Retrieved from https://www.eff.org/deeplinks/2015/01/facing-challenge-online-harassment

Kerr, A., & Kelleher, J. D. (2015). The Recruitment of passion and community in the service of capital: Community managers in the digital games industry. Critical Studies in Media Communication, 32(3), 177-192.

Kyle, L. (2016). The Doctrinal Toll of “Information as Speech”. First Amendment case law.

Martin, H. R. (2017). Commercial Speech and the Values of Free Expression. Cato Institute Policy Analysis.

Richard, A. (2018). Hard Questions: Where Do We Draw the Line on Free Expression? Facebook Newsroom.

Tobin, A., Varner, M., & Angwin, J. (2017). Facebook’s Uneven Enforcement of Hate Speech Rules Allows Vile Posts to Stay Up. ProPublica. Retrieved from https://www.propublica.org/article/facebook-enforcement-hate-speech-rules-mistakes