Issues of Content Moderation

In an era of exploding online information, content that spreads rapidly across digital platforms can affect national security, social stability, and public well-being, so the health and safety of platform content must be taken seriously. Media companies now review platform content ever more strictly, combining manual and machine review to improve accuracy, yet these moderation measures remain controversial in areas such as the handling of disputed content, freedom of speech, and user privacy. At the same time, governments have stepped up their supervision of digital platforms, and this too has caused great controversy: although state involvement can make control more robust, it can also alarm the public, reduce openness and transparency, and easily tend toward a hegemonic government.

 

The Challenges of Content Moderation

 

Content moderation places heavy demands on digital platforms in human, financial, and technical terms. In the earliest stage, risky content could only be reviewed manually by editors. This approach requires a large workforce, so reviewing submissions takes a long time, reviewers' professional judgment of violations is hard to standardize, and timeliness suffers significantly; an average company can scarcely afford such a large review team.

Later, AI was introduced into content review, and techniques such as keyword filtering appeared, which simply block any submission that hits a preset list of prohibited keywords. The weaknesses of keyword filtering are equally apparent: accuracy is low, and both false positives (harmless content blocked by accident) and false negatives (violations slipping through) occur frequently, which community platforms cannot tolerate. There are also two main review mechanisms: publish-first-then-review and review-first-then-publish. Publishing first and checking later does not mean the content goes entirely unchecked before it appears; it still passes a machine review before it is posted and displayed. Machine review relies on accumulated review rules and term lists and is limited to measurable factors such as prohibited words, content length, and spam, so it cannot block everything that needs to be blocked; and because words cannot be judged in context, it is also prone to false positives. Manual review avoids these shortcomings of machine review, but relying on it alone at today's data volumes is unrealistic: it raises operation and maintenance costs, and for time-sensitive content it delays information and degrades the product experience.
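
As a minimal sketch of the keyword-filtering idea described above (not any platform's actual pipeline; the blocked-term list, function names, and sample messages are invented for illustration), the following Python snippet shows why plain substring matching produces the false positives mentioned here, and why even a small refinement such as word-boundary matching does not resolve the underlying lack of context.

import re

# Hypothetical blocked-term list; a real platform maintains far larger,
# curated rule sets and term lists, as noted above.
BLOCKED_TERMS = ["scam", "free prize"]

def naive_filter(text):
    # Plain substring matching: the simplest form of keyword filtering.
    lowered = text.lower()
    return any(term in lowered for term in BLOCKED_TERMS)

def word_boundary_filter(text):
    # Slightly stricter: match blocked terms only as whole words or phrases.
    lowered = text.lower()
    return any(re.search(r"\b" + re.escape(term) + r"\b", lowered)
               for term in BLOCKED_TERMS)

if __name__ == "__main__":
    samples = [
        "Congratulations, claim your free prize now!",  # should be blocked
        "Try this scampi recipe tonight.",              # harmless, but "scampi" contains "scam"
    ]
    for msg in samples:
        print(naive_filter(msg), word_boundary_filter(msg), "-", msg)

Running this sketch, the naive filter blocks both messages (the recipe is a false positive), while the word-boundary version blocks only the first; neither variant can weigh context, which is why machine review alone cannot replace manual judgment.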

 

Freedom of Speech and Content Moderation

 

Content moderation has also caused widespread controversy among the public and is considered one of the most pressing challenges facing freedom of speech. It is undeniable that digital platforms without content review have promoted citizens' freedom of speech. Traditional speech media often require citizens to reveal their real identities, which leads many to give up exercising their right to speak for fear of retaliation and other consequences. The anonymity of online platforms allows people who have many concerns in real life to find a sense of identification and an outlet on the Internet, to sidestep legal and moral pressure, and to express their thoughts without scruple. Speech is naturally freer when no qualifications or review procedures stand in the way: because everyone's posts look alike online, stripped of identity markers, the only thing that can attract other users' attention is the idea being expressed. This provides an accessible communication platform, encourages citizens to voice their views fully, and protects human rights. Aaron Swartz, for example, spent his life acting on the ethical principles he believed in, information sharing and freedom of speech; from his teenage years he resisted Internet censorship and called for the free sharing of resources and information. However, we should also recognize that, given the uneven conduct of users and the continuous expansion of free expression, digital platforms have suffered adverse effects, such as fraudulent "lottery winning" messages. Some users are extreme and exaggerate their dissatisfaction with society, and people with similar negative emotions can gather to make anti-social remarks that disrupt public order. Therefore, although freedom of speech is indispensable to the development of society as a whole, the significant drawbacks that have emerged on digital platforms make content moderation measures necessary.

 

Government and Content Moderation

The practical implementation of content moderation cannot be separated from government support, yet that support can easily turn into network hegemony. Legislation is undoubtedly a vital means of regulating and safeguarding the health of platform speech. Globally, countries have worked hard to build free speech protection systems suited to their own national conditions, such as the freedom of information, personal privacy, and electronic communications privacy laws enacted in the United States. Regulating online speech through legislation is therefore a general trend, aimed at curbing the disadvantages of anonymous speakers and the negative influence of arbitrary platform content. It is equally undeniable, however, that excessive government involvement in platform content review can turn the government into a network hegemon. When the government intervenes in servers and network content, cyberspace can be policed on the grounds of preventing organized subversion of the regime, as the Snowden incident showed. Because the government places great weight on national politics and security, it may monitor the public and intrude on their privacy through the platforms, which has contributed to a crisis of public confidence in the government and the state apparatus; it also reduces the welfare and utility of digital media and hinders the governance of cyberspace. Therefore, combining platforms' internal management with measured government assistance, and involving multiple actors fully in platform content governance, can make the governance of the networked society more scientific and reasonable.

 

Conclusion

In short, operating content moderation severely tests digital platforms' workforce, financial resources, and algorithms, and even review mechanisms that combine manual work with artificial intelligence still cannot make moderation perfect. Content moderation can significantly improve the health and safety of digital platforms, but its implications for freedom of speech and personal privacy remain the focus of public controversy, especially the possibility that government intervention in platform content review will turn into cyber hegemony. It is undeniable that a platform seeking better control of its content cannot do without the state's help in formulating relevant regulations; digital platforms will develop best when industry self-regulation is combined with government support.

 

 

REFERENCE LIST

 

Giovanni de Gregorio (2019). Free speech in the age of online content moderation. Voelkerrechtsblog.

https://voelkerrechtsblog.org/de/free-speech-in-the-age-of-online-content-moderation/

 

John Samples (2019). Why the Government Should Not Regulate Content Moderation of Social Media. Cato Institute.

https://www.cato.org/policy-analysis/why-government-should-not-regulate-content-moderation-social-media

 

Orin Kerr (2015). Edward Snowden’s impact. The Volokh Conspiracy.

https://www.washingtonpost.com/news/volokh-conspiracy/wp/2015/04/09/edward-snowdens-impact/

 

Sérgio Amadeu da Silveira (2013). Aaron Swartz and the Battles for Freedom of Knowledge. International Journal on Human Rights.

https://sur.conectas.org/en/aaron-swartz-battles-freedom-knowledge/

 

Stephanie Walker (2020). What is Content Moderation & Types of Content Moderation. New Media Services.

https://newmediaservices.com.au/fundamental-basics-of-content-moderation/