The problems of content moderation

ARIN2610 Assignment 2

This article discusses the problems that arise during content moderation: the disputes it generates, the toll it takes on the people who carry it out, and the government's role in regulating content on social media. The early Internet aimed to be a place of open expression, giving everyone a voice. Today, however, platforms face volumes of media and content far beyond what any open forum can absorb unchecked, so content moderation systems have been implemented to keep them usable. Faced with this flood of information, someone must monitor content to prevent the spread of harmful material, such as violence and pornography, and to ensure that the information reaching audiences reflects sound values and serves the public interest. Yet the scale is so vast that the resources of Internet companies and platform operators alone are far from enough; governments also need to respond with appropriate countermeasures.

“internet explorer” by Sean MacEntee is licensed under CC BY 2.0


Content review mainly ensures that user-generated content complies with a platform's rules and requirements, but disputes inevitably arise during the process. In 1972, Associated Press photographer Nick Ut photographed naked children fleeing the Vietnam War, an image that showed the cruelty of war and how severely it affected people's wellbeing. Facebook later deleted this work because its rules prohibit violent or pornographic images (Gillespie, 2018). The photograph is deeply uncomfortable for the platform: it violates the letter of the rules because it contains disturbing material, yet it is meaningful and conveys an important message. From the perspective of Facebook's guidelines, removal can be defended. First, the photo was taken without the children's permission; second, once republished, it could be misread by audiences across different languages and cultural contexts, with some viewers judging it pornographic or obscene. Viewed as a work of photojournalism, however, it is anything but. The case shows that in content supervision many judgments lie beyond the reach of simple rules, and the platform itself can become the source of controversy.


The larger the Internet platform, the more content reviewers it requires, and these workers must perform a delicate balancing act between protecting user safety and preserving users' freedom of speech online. Reviewers are immersed in beheading videos, rape videos, and crime-scene photos for hours each day. Even with good pay and good working conditions, the content they constantly see takes a toll. Chloe, a Facebook content reviewer, developed a mental illness after witnessing a man's death on screen: the video showed him being stabbed a dozen times while he screamed for mercy, and her job was to decide whether the post should be deleted (Newton, 2019). There may be tens of thousands of content reviewers like Chloe around the world enduring the same suffering, yet the work of content review must be done by someone. These cases reflect deep problems in our society: we can neither eliminate the evil in human nature nor entirely prevent it from being published online.


Can these problems be improved? Can the government play a larger role in content review? Governments can legislate on content review or raise audiences' awareness of it. Many videos and pictures are banned to prevent minors from imitating or learning from them, since such content inevitably carries incorrect values or behaviours; and because the Internet carries enormous amounts of information and transmits it extremely quickly, failure to control the spread of harmful material can lead many teenagers astray. For example, many idol talent shows in China have been banned, in part to stop minors, and even adults, from voting for their favourite idols and buying large quantities of the products those idols endorse (Stevenson, Chien & Li, 2021). In one scheme, each bottle of endorsed milk purchased counted as a vote, so some fans bought milk by the box and ultimately let it go to waste. This is extremely wasteful behaviour, and the platform also had to answer for its fans' actions. Law and government act as the finish line on the road towards getting the right content published while harmful information is deleted or reported as inappropriate.

“Milk” by JeepersMedia is licensed under CC BY 2.0


Content moderation is vital in helping young people grow up normally in a safe online atmosphere, yet it is difficult to control what a child sees online and chooses to believe. Celebrities and influencers clearly shape what young users choose to follow on social media. Government intervention can reduce the pressure to achieve perfection through certain products: cosmetic-surgery content, for instance, pressures young girls and boys to the extent of desiring to change their bodies, and is a main driver of plastic surgery because the influence blurs the line between reality and fiction. Content moderation also matters for brands; the problem arises when content is used to manipulate people's minds, so illegal content and outright false information should be eradicated through rules banning its availability. Likewise, sexually explicit content can cause emotional turmoil that children are ill-equipped to bear, so advertising for products such as condoms and family-planning methods should be age-appropriate and targeted at the right audience. If the process of creating awareness is governed by measures put in place by the government, the online world can become an educational rather than a deceitful space.


Consequently, monitoring online content and ensuring it is safe for consumers is vital to keeping it within government regulations. Rules that disallow violent material should be framed with an understanding of the effect such material has on people's mental health. Targeting the right audiences with guidelines that fit their age brackets will be a step towards steering their progress in life. If the process is executed well, the government can ban false information that misleads the young. Affiliate marketing is valuable to brands, but the process can mislead the younger generation; racist and discriminatory information, along with anything that divides people based on their physical appearance, should be kept out of public posts.

The content moderation process itself is also laborious. It mainly involves systematically detecting and recognising text, pictures, audio, and video that contain politically sensitive material or figures, graphic violations, violent or terrorist content, advertisements, and other spam. The process provides auditing, labelling, custom configuration, and other capabilities to ensure corporate content security, while also purifying the media environment so that people can enjoy the convenience of the network safely. Whatever the type of content under review, the concrete operations include machine review, manual review, user-complaint review, and result review. This is a complex task that must be entrusted to experienced and knowledgeable workers so that both the audience and the platform can be trusted.
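The staged workflow described above can be sketched in code. This is a minimal illustration, not any real platform's system: the keyword lists, thresholds, and function names are hypothetical placeholders standing in for far more sophisticated classifiers, and it covers only the routing logic (automated "machine review" first, with borderline items escalated to a human queue, and user complaints triggering a second look).

```python
# Hypothetical example lists; real platforms use trained classifiers,
# not keyword matching.
BANNED_TERMS = {"beheading", "gore"}      # clear violations -> auto-remove
REVIEW_TERMS = {"violence", "weapon"}     # borderline -> human review

def machine_review(post: str) -> str:
    """Automated first pass: return 'remove', 'manual', or 'approve'."""
    words = set(post.lower().split())
    if words & BANNED_TERMS:
        return "remove"    # unambiguous violation, deleted automatically
    if words & REVIEW_TERMS:
        return "manual"    # uncertain case, escalated to a human reviewer
    return "approve"

def moderate(posts, complaints=()):
    """Run machine review over posts; re-review any user-flagged posts."""
    decisions, manual_queue = {}, []
    for post in posts:
        verdict = machine_review(post)
        if verdict == "manual":
            manual_queue.append(post)   # awaits manual review
        decisions[post] = verdict
    for post in complaints:             # user-complaint review stage
        decisions[post] = machine_review(post)
    return decisions, manual_queue
```

Even this toy version shows why human reviewers remain unavoidable: automated rules can only sort the obvious cases, and everything ambiguous lands in the manual queue.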


In conclusion, digital content platforms, especially social media platforms, occupy a unique position in facilitating the distribution, use, and processing of unprecedented amounts of content and data by an increasingly globalised network of users, content producers, and advertisers. Their role in managing and protecting their participants and content is attracting ever more regulation and commercial scrutiny. Many problems can arise during content review, but the challenges can be minimised through cooperation between governments and the owners of different platforms, since not only the platform but also content reviewers and audiences may be affected. As the volume of content on the Internet keeps growing, a large amount of harmful material unfit for dissemination threatens national security, social stability, and harmony, and above all the development of young people, so government supervision has become increasingly strict and the necessity and urgency of content review have become clear. Content moderation is a comprehensive review of information: checking and verifying user posts, digital platforms, and websites to ensure the security of the entire platform.


Reference list:

Newton, C. (2019, February 25). The trauma floor. The Verge. Retrieved October 15, 2021, from https://www.theverge.com/2019/2/25/18229714/cognizant-facebook-content-moderator-interviews-trauma-working-conditions-arizona.

New Media Services. (2020, December 21). What is content moderation & types of content moderation. Retrieved October 15, 2021, from https://newmediaservices.com.au/fundamental-basics-of-content-moderation/.

Stevenson, A., Chien, A. C., & Li, C. (2021, August 27). China's celebrity culture is raucous. The authorities want to change that. The New York Times. Retrieved October 15, 2021, from https://www.nytimes.com/2021/08/27/business/media/china-celebrity-culture.html.

Gillespie, T. (2018). All platforms moderate. In Custodians of the Internet: Platforms, content moderation, and the hidden decisions that shape social media (pp. 1-23). New Haven: Yale University Press. https://doi-org.ezproxy.library.sydney.edu.au/10.12987/9780300235029

Talking Tech. (n.d.). Content moderation and online platforms: An impossible problem? Retrieved October 15, 2021, from https://talkingtech.cliffordchance.com/en/industries/e-commerce/content-moderation-and-online-platforms–an-impossible-problem–.html.

Creative Commons License
The problems of content moderation by Dongge Zhang is licensed under a Creative Commons Attribution 4.0 International License.