The emergence of online platforms has made the exchange of opinions and the sharing of information more convenient, but illegal content also spreads rapidly through these platforms, causing harm to users and society. Governments have taken various legislative approaches to address this issue, and online platforms, which control the algorithms and recommender systems that determine what content users see, have become a primary target of accountability. Regulating online speech is challenging because the moderation of inappropriate content bears directly on freedom of expression. Lawmakers therefore need a framework that imposes legal responsibility on platforms to effectively curb the spread of inappropriate online content, while preventing platforms from abusing their power and safeguarding people's basic rights. This study focuses on the European Union's Digital Services Act and the United Kingdom's draft Online Safety Bill, and analyzes how the two regulations govern content moderation on platforms, including how they define the types of regulated content, the obligations they impose on platforms, and the moderation measures they require. The study also explores the roles and relationships amongst the government, platforms, and society as depicted in these regulations.

The draft Online Safety Bill focuses on illegal content and content that is harmful to children. It categorizes platforms into different types based on the audiences they serve and the size of their user base, and requires platforms to comply with corresponding duties, such as carrying out illegal content risk assessments and taking proportionate measures to mitigate the risks identified in those assessments. The Digital Services Act mainly deals with illegal content and assigns different obligations to platforms based on their user base, placing much greater responsibility on very large online platforms due to their greater influence. Compared to the Digital Services Act, which only requires platforms to take action upon receiving a notice and exempts them from proactively monitoring online content, the draft Online Safety Bill requires platforms to moderate content more proactively: platforms must manage the risk of harm to individuals caused by illegal content and minimize the length of time such content remains available, and the Bill also contains provisions on “proactive technology” that assists platforms in detecting inappropriate content in a timely manner.

This study also observes the relationships amongst the government, platforms, and society in the draft Online Safety Bill and the Digital Services Act. The government's main role is to supervise platforms and ensure that they comply with the regulations and fulfill their moderation duties; the draft Online Safety Bill additionally positions the government as a guide that assists platforms in meeting their obligations. Platforms, as service providers, are responsible for designing proportionate systems that ensure the safety of users, and they must regularly publish transparency reports explaining how they carry out content moderation. Civil society members are not only recipients of platform services but also assist the government in supervising platform operations: individuals who find inappropriate content can report it to the platform, and professionals and organizations can help identify illegal content, analyze platform operations, or mediate disputes. Overall, both regulations aim to achieve digital governance through enhanced transparency and the involvement of multiple stakeholders.