The Presence of Digital Platforms Today

“Social media icons collection” by Ibrahim.ID is licensed under CC BY-SA 4.0

With the constant rise and development of new technologies, the world has been opened to new and exciting inventions and innovations. Over the past 20 years, these technological advancements have played a vital role in society, shaping the way we interact, learn, and think. Starting with Tim Berners-Lee’s introduction of the World Wide Web in 1989, society has dramatically expanded the way we use the internet, evolving into the digital era we call ‘Web 2.0’.

Coined by Darcy DiNucci, the term ‘Web 2.0’ refers to the changing nature of the World Wide Web and the next generation of internet technologies designed to “enhance creativity, secure information sharing, increase collaboration, and improve the functionality of the web as we know it” (Pacelt, 2021). The ‘Web 2.0’ era has allowed people to gather and form groups based on common interests, enabling users to interact socially with one another and paving the way for the digital platforms we now see throughout our everyday lives.

Today, digital platforms have become commonplace in society, with various types spread across the internet: social media platforms, knowledge platforms, media-sharing platforms and service-oriented platforms (Watts, 2020). However, as these platforms have spread and social culture has continued to change, several issues have arisen alongside them.

A common problem within digital platforms is the issue of new regulatory constraints, typically seen across social media and media-sharing platforms such as Facebook, Twitter, Instagram and YouTube. These platforms have become mainstream tools in society, allowing people to “build relationships, connect with companies, conduct research, share content, and even purchase products” (Durfy, 2018). However, as platforms like these continue to change and grow in content, so too do the rules and regulations that manage and moderate them.

What is ‘Content Moderation’?

With over 4.66 billion active internet users (Johnson, 2020) exposed to online content, and large digital platforms continuing to grow, it has become crucial for these companies to monitor and moderate content in order to maintain a healthy community. Today, social media companies largely determine the content we see, with each platform differing in the rules and regulations it applies. This brings us to the key process we call ‘content moderation’. “Content moderation is the organized practice of screening user-generated content posted to internet sites, social media, and other online outlets” (Roberts, 2019). Content moderation protects users from illicit content online and allows companies to better understand their users. However, there have been numerous concerns about the problems these systems can create for internet users today.
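To make the screening idea concrete, below is a minimal sketch of rule-based moderation written in Python. The blocklist, decision labels, and escalation logic are illustrative assumptions only; real platforms layer machine-learning classifiers, human review teams, and detailed policies on top of anything this simple.

```python
# A minimal, illustrative sketch of rule-based content screening.
# The blocklist and decision labels are hypothetical, not any
# platform's actual policy.

BLOCKLIST = {"buy followers", "spam-link.example", "graphic violence"}

def screen_post(text: str) -> str:
    """Return a moderation decision for a piece of user-generated content."""
    lowered = text.lower()
    hits = [term for term in BLOCKLIST if term in lowered]
    if not hits:
        return "approve"            # nothing matched: publish normally
    if len(hits) == 1:
        return "queue_for_review"   # borderline: escalate to a human moderator
    return "remove"                 # multiple matches: auto-remove

print(screen_post("Check out my holiday photos!"))            # approve
print(screen_post("buy followers at spam-link.example now"))  # remove
```

Even this toy version shows why moderation at scale is contentious: every decision boundary, from the blocklist to the thresholds, encodes a judgement about what counts as acceptable speech.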

“Passe-partout (computer, e-mail, online)” by Wies_van_Erp is licensed under CC BY-ND 2.0

Issues with Content Moderation

Over the past few years, content moderation has faced many ongoing concerns throughout society. The internet is filled with harmful content such as fake news, pornography, drug-related material, and violence, which makes it hard for media platforms to track and moderate offensive and explicit activity. Some digital platforms today enlist people to moderate content shared online. This process, known as ‘user moderation’, has prompted arguments over freedom of speech and the accountability people carry in the role of moderator.

Another thing to note is that society is always evolving, which means that rules and regulatory policies must also be updated. Tarleton Gillespie suggests that “one of the biggest challenges platforms face is establishing and enforcing a content moderation regime that can address both extremes simultaneously” (Gillespie, 2018), that is, both under-moderation that lets harmful content spread and over-moderation that stifles legitimate expression. In this instance, Gillespie suggests that moderators occupy a meticulous role: they must remain cautious and “maintain sensitive judgement” when deciding what counts as appropriate content.

Attempts to Implement Content Moderation and the Controversies that Followed

Today, many internet users are familiar with YouTube, a media-sharing platform that allows users to publicly share and watch videos online. Being such a popular site, YouTube is no stranger to backlash over content moderation and regulation. One prime example of YouTube’s attempts to implement content moderation is ‘YouTube Heroes’, which sparked controversy among the YouTube community. In September 2016, YouTube announced this new incentive program, which rewarded users with points for contributing to the site by moderating comments, captioning videos, and flagging videos that violated YouTube’s Terms of Service (Smyth, 2016).

“YouTube in iPhone screengrab” by Hello I’m Nik is licensed under CC BY-SA 4.0

Many people considered this a form of censorship, since the program gave users the ability to flag multiple videos posted online. This flagging system could cause videos to lose monetisation, become age-restricted, or even be taken down from the site, which many content creators feared (Gorey, 2016).
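As a rough illustration of why creators were worried, here is a hypothetical sketch of how a points-and-flags system like the one described above might behave. The threshold and point values are invented; YouTube never published the program’s actual mechanics.

```python
# Hypothetical sketch of a community flagging system, loosely modelled
# on the 'YouTube Heroes' description above. All numbers are invented.
from collections import defaultdict

FLAG_REVIEW_THRESHOLD = 5   # flags needed before a video is escalated
POINTS_PER_FLAG = 1         # reward credited to the flagging user

flag_counts: dict[str, int] = defaultdict(int)
hero_points: dict[str, int] = defaultdict(int)

def flag_video(video_id: str, user: str) -> str:
    """Record a community flag and report the video's status."""
    flag_counts[video_id] += 1
    hero_points[user] += POINTS_PER_FLAG
    if flag_counts[video_id] >= FLAG_REVIEW_THRESHOLD:
        return f"{video_id}: escalated for review"
    return f"{video_id}: flag {flag_counts[video_id]} recorded"

# Five different accounts flagging one video is enough to escalate it.
for i in range(5):
    print(flag_video("video123", f"user{i}"))
```

The creators’ fear is visible even in this toy model: because flags are rewarded and counted without regard to merit, a small coordinated group can push any video over the review threshold.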

“youtube JPG” by renaissancechambara is licensed under CC BY 2.0

Although this incident is now largely ‘forgotten history’, ideas of censorship are still quite prevalent throughout the YouTube community today. In recent years, copyright violations have become a prominent issue on the internet, with automated systems detecting and enforcing against allegedly infringing content published online. In some instances, these policies have been beneficial in removing explicit content shared online; however, because these enforcement systems are automated, creators have little control over what gets flagged as illicit and can therefore lose their creative freedom (Trendacosta, 2020). Today, this issue affects many content creators within the media industry, as copyright regulations restrict and police the creative content they share for their viewers on the internet.
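The internals of systems like YouTube’s Content ID are proprietary, but the general shape of automated copyright matching can be sketched with simple hashing. The chunk size and catalogue below are assumptions for illustration; real fingerprinting is far more robust to edits, re-encoding, and pitch shifts.

```python
# Illustrative sketch of fingerprint-style copyright matching.
# Real systems such as YouTube's Content ID use proprietary audio/video
# fingerprints; SHA-256 hashes of fixed-size chunks stand in for them here.
import hashlib

def fingerprints(data: bytes, chunk_size: int = 1024) -> set[str]:
    """Hash fixed-size chunks of media data as stand-in fingerprints."""
    return {
        hashlib.sha256(data[i:i + chunk_size]).hexdigest()
        for i in range(0, len(data), chunk_size)
    }

# A rights holder registers their work in the reference catalogue.
reference_catalog = fingerprints(b"LICENSED-SONG" * 500)

def check_upload(upload: bytes) -> str:
    """Flag an upload if any of its fingerprints match the catalogue."""
    overlap = fingerprints(upload) & reference_catalog
    # The automated match sees only the overlap; it cannot weigh context,
    # commentary, or parody -- the fair-use questions a human would ask.
    return "claimed" if overlap else "clear"

print(check_upload(b"LICENSED-SONG" * 500))     # claimed
print(check_upload(b"an original composition")) # clear
```

This is the sense in which creators have little control: the match is mechanical, and the burden of disputing it falls on the uploader.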

Government Roles in Enforcing Content Moderation

Government involvement in content moderation has become a prominent issue in today’s society. With new government laws and the continued evolution of the internet, the need for a stable moderation system is essential. With this in mind, governments have proposed a number of policies that they believe would support digital platforms and their practices. Some of these proposals include centring the protection of marginalised communities, recognising the increased potential of public figures to drive online and offline harms, and removing hate speech and harassment (Díaz & Hecht-Felella, 2021).

Although the governance of content moderation can be beneficial to specific digital platforms, it can also “result in detrimental outcomes for free speech within democracies” (Mchangama, 2021). This has caused a social divide, with some people opposed to the idea, suggesting that government involvement will impinge on freedom of speech, allowing authorities to suppress comments they dislike and even silence civil society and other critics (United Nations Human Rights, 2021).

Conclusion

Although content moderation has proven valuable in filtering out explicit content, there are still issues that could potentially arise for these systems in the near future. Today, we live in an era where digital platforms have become a necessity in our daily lives, with people exposed to online content every day. The constant growth of internet culture has shaped the way people interact socially, and in turn shapes the changing rules and regulations placed on digital platforms. The era we call ‘Web 2.0’ has highlighted “user-generated content, usability and interoperability for end users” (Serp Wizard, 2017), and with the changing nature of the internet, who knows what advancements ‘Web 3.0’ and ‘Web 4.0’ may bring in the foreseeable future.

References

Díaz, Á., & Hecht-Felella, L. (2021, August 4). Double standards in social media content moderation. Brennan Center for Justice. https://www.brennancenter.org/our-work/research-reports/double-standards-social-media-content-moderation

Durfy, L. (2018, December 20). Social media for every generation. PostBeyond. https://www.postbeyond.com/social-media-generations-2/

Gillespie, T. (2018). All platforms moderate. In Custodians of the internet: Platforms, content moderation, and the hidden decisions that shape social media (pp. 1-23). Yale University Press.

Gorey, C. (2016, September 22). Big YouTubers are not happy with YouTube Heroes concept. Silicon Republic. https://www.siliconrepublic.com/play/youtube-heroes-criticism-twitter

Johnson, J. (2020, July 24). Global digital population 2020. Statista. https://www.statista.com/statistics/617136/digital-population-worldwide/

Mchangama, J. (2021, January 26). Rushing to Judgment: Examining Government Mandated Content Moderation. Lawfare. https://www.lawfareblog.com/rushing-judgment-examining-government-mandated-content-moderation

Pacelt, O. (2021, March 9). Story of the Internet. From Web 1.0 to Web 4.0. Botland. https://botland.store/blog/story-of-the-internet-from-web-1-0-to-web-4-0/

Rieza, E. (2020, April 24). 3 Reasons Content Moderation is Key for Business. New Media Services. https://newmediaservices.com.au/content-moderation-3-reasons-why-it-is-crucial-for-your-business/

Roberts, S. T. (2019). Behind the screen: Content moderation in the shadows of social media (pp. 33-72). Yale University Press.

Serp Wizard. (2017, April 5). Differences between Web 3.0 and Web 2.0 websites. Serp Wizard. https://www.serpwizard.com/differences-between-web-3-0-and-web-2-0-websites/

Sharma, M. (2018, September 24). Web 1.0, Web 2.0 and Web 3.0 with their difference. GeeksforGeeks. https://www.geeksforgeeks.org/web-1-0-web-2-0-and-web-3-0-with-their-difference/

Smyth, C. (2016, September 24). YouTubers Are Furious At The New Flagging System. BuzzFeed. https://www.buzzfeed.com/cassiesmyth/youtube-have-introduced-a-new-flagging-system-and-youtubers

Tehrani, A. (2021, May 21). Why user-led moderation could be the answer to the content moderation problem. Forbes. https://www.forbes.com/sites/forbestechcouncil/2021/05/21/why-user-led-moderation-could-be-the-answer-to-the-content-moderation-problem/

Trendacosta, K. (2020, December 10). Unfiltered: How YouTube’s Content ID Discourages Fair Use and Dictates What We See Online. Electronic Frontier Foundation. https://www.eff.org/wp/unfiltered-how-youtubes-content-id-discourages-fair-use-and-dictates-what-we-see-online

United Nations Human Rights. (2021, July 23). Moderating online content: Fighting harm or silencing dissent? OHCHR. https://www.ohchr.org/EN/NewsEvents/Pages/Online-content-regulation.aspx

Watts, S. (2020, July 8). Digital platforms: A brief introduction. BMC Blogs. https://www.bmc.com/blogs/digital-platforms/