Platforms and Content Moderation: Are they doing enough?

Assignment 2 - Tut 02 Yuan Jiang

Content moderation on social media platforms is hotly debated, given platforms' role in influencing society and politics and their moral obligations to citizens and users. Comedian Sacha Baron Cohen, recipient of the ADL International Leadership Award, famously argued in his 2019 acceptance speech at the ADL's "Never Is Now" summit that while content moderation is complex, platforms such as Facebook are simply not doing enough to protect their users and online communities. The ADL is an organisation that strives to eliminate racism, hate and bigotry, and by naming Facebook the largest publisher and propaganda machine in history (Baron Cohen, 2019), Baron Cohen, along with many academics, emphasises why and how platforms need to be moderated, and why this is so important for protecting our society and democracies into the future.

 

Never Is Now 2019 | ADL International Leadership Award Presented to Sacha Baron Cohen. Source: YouTube – https://www.youtube.com/watch?v=ymaWq5yZIYM&t=1151s

 

What exactly is content moderation, and why do we need it?

Gillespie, in his chapter "All Platforms Moderate", explains that moderation is part of the design and infrastructure of platforms and of how they operate; platforms are simply not platforms without moderation (Gillespie, 2018). He defines the role of platforms as circulating users' shared content without commissioning that content themselves, built on an infrastructure that processes data for profit, advertising, and customer service (Gillespie, 2018). Because their business model depends on keeping users engaged for as long as possible to share and view content, platforms must orchestrate that content in order to achieve this. The idea of platforms as "impartial", "open" and "unregulated" is therefore a fantasy: moderators are required to pick and choose all the time (Gillespie, 2018). Content moderation consists of shaping what content is allowed to be published, as well as how groups and individuals are allowed to interact on a platform. The practice is controversial because of the power of these platforms: decisions rest in the hands of only a few, the CEOs of the "Silicon Six", yet have the potential to influence billions of users. This power disparity further highlights why moderation is so important.

“iPhone 6s with icons of social media on screen. Smartphone life style smartphone. Starting social media app.- Credit to https://www.lyncconf.com/” by nodstrum is licensed under CC BY 2.0

 

Platforms such as Reddit have faced serious criticism in the past for a platform design that chooses not to interfere with, and offers little support for discouraging, harmful material, and for a "hands-off" approach to community governance associated with encouraging toxic technocultures (Massanari, 2017). Since all platforms are required to moderate to some degree, by choosing this "neutral" stance Reddit administrators are deliberately designing the type of culture and content they want shared on their platform, arguably to encourage a sense of play and candour (Massanari, 2017). The #GamerGate harassment campaign was a consequence of these choices: game developer Zoe Quinn became the target of shared private conversations and a centrepiece of a hateful campaign to delegitimise and harass women in the gaming community (Massanari, 2017).

 

By allowing such content to be shared and creating an environment that encouraged it, Reddit is an example of how the decisions administrators make daily can ultimately shape the culture that develops on their platforms.

 

“Social Media v1” by the UMF is licensed under CC BY 2.0. Source: Flickr.

Are social media companies above the law?

The full-time work of moderating user-generated content is heavily debated, yet remains relatively unknown and not fully disclosed (Roberts, 2019). There are also complexities in determining where social media companies fit within the media industry on a global scale, and which regulations they should be required to abide by. This is one of the key challenges of regulation in the digital age, as national policies often conflict and laws are not always straightforward (Flew, Martin & Suzor, 2019).

 

The allocation of content regulation resources is uneven, and cybersecurity demands and laws differ across the globe: Germany's Network Enforcement Act and Europe's Code of Conduct on Countering Illegal Hate Speech Online govern media organisations in those jurisdictions, while U.S. communications law provisions and constitutional speech protections do not apply outside the United States (Flew, Martin & Suzor, 2019). This makes governing social media platforms across countries challenging without generating a "splinternet", in which fundamentally different values would apply depending on territorial jurisdiction (Flew, Martin & Suzor, 2019; Lemley, 2021). This is how, critics explain, social media platforms are able to behave as if above the law, in a grey area of self-regulation.

 

“Newspapers B&W (5)” by NS Newsflash is licensed under CC BY 2.0. Source: Flickr.

Because it publishes and circulates content to billions of people daily, Facebook could be considered the biggest publisher in the world (Baron Cohen, 2019). Given this influence, it arguably should be regulated like all other media outlets, such as newspapers, film, and television, with clear restrictions on its content, and should be liable to recall and fix its products if they are defective, no matter the expense. Content moderation is important for ensuring brand protection and adherence to terms-of-use statements, site guidelines and legal regimes (Roberts, 2019). The power these platforms have to enable bigotry, racism and hate is why it is crucial that they be required to recall harmful content once it is published, even if this does not appeal to all users on their platforms.

 

Freedom of Expression and Hate Speech

Platforms often justify a neutral or indifferent approach to content moderation by appealing to "freedom of expression". Facebook CEO Mark Zuckerberg has used this argument against increased regulation of digital platforms, stating that Facebook's goal is to "uphold as wide a definition of expression as possible" (Baron Cohen, 2019). However, giving a voice to everyone irrespective of their values, including conspiracy theorists, bigots, and paedophiles, allows them to amplify their views and target their victims, encouraging more hate and spreading misinformation.

Whilst it may seem ideal to always show "two sides" of a story, both sides are not always true, and the spreading of lies, for example by Holocaust deniers, is extremely detrimental to wider society and to respect for historical fact. Because of their ability to influence our perception of the truth, these platforms have an educational as well as a moral obligation to recognise these clear differences (Baron Cohen, 2019).

“Protest against Islamophobia and hate speech” by Fibonacci Blue is licensed under CC BY 2.0. 

Stories that trigger outrage or fear often generate the most engagement, which may be why platform algorithms and administrators often choose not to moderate them and allow these stories to spread. Donald Trump's ban from Twitter and Facebook, whilst a controversial decision, is an example of platforms taking action to stop the spread of misinformation, but are they doing enough? Google, one of the world's largest companies, continues to profit from hatred by allowing extremists to have a voice and to target their victims with one simple click (Flew, Martin & Suzor, 2019). By enabling total "freedom of expression", companies are also enabling hatred and lies. Political advertising continues to run on Google and Facebook, enabling micro-targeting of users that can be very powerful in influencing public opinion. Clearly the line between truth and lies can be blurred, but many would argue that these platforms are actively choosing not to interfere.

 

What can we do about it?

Moderation is difficult because it is resource intensive and relentless (Gillespie, 2018). However, if platforms take a more active and less neutral approach, even at the cost of profit from hiring more moderators, they will recognise their moral obligation to society and hold themselves accountable for the content they allow users to post. Platforms are designed to moderate; they do not merely mediate public discourse, they constitute it (Gillespie, 2018). It can no longer be acceptable for these companies to fail to acknowledge the consequences of their choices, and our regulation policies need to recognise their role and influence as the biggest media organisations in the world.

 

 

References

 

Baron Cohen, S. (2019). Never Is Now. Speech, ADL International Leadership Award Conference.

 

Flew, T., Martin, F., & Suzor, N. (2019). Internet regulation as media policy: Rethinking the question of digital communication platform governance. Journal of Digital Media & Policy, 10(1), 33-50. https://doi.org/10.1386/jdmp.10.1.33_1

 

Gillespie, T. (2018). Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions That Shape Social Media (pp. 1-23). Yale University Press.

 

Gillespie, T. (2018). Regulation of and by platforms. In J. Burgess, A. Marwick & T. Poell (Eds.), The SAGE Handbook of Social Media (pp. 254-278). SAGE Publications.

 

Lemley, M. (2021). The splinternet. Duke Law Journal, 70(6), 1397.

 

Massanari, A. (2017). #Gamergate and The Fappening: How Reddit's algorithm, governance, and culture support toxic technocultures. New Media & Society, 19(3), 329-346. https://doi.org/10.1177/1461444815608807

 

Roberts, S. (2019). Behind the Screen: Content Moderation in the Shadows of Social Media (pp. 33-72). Yale University Press.