
San Diego International Law Journal

Authors

Trent Scheurman

Library of Congress Authority File

http://id.loc.gov/authorities/names/n79122466.html

Document Type

Comment

Abstract

This Article will compare 47 U.S.C. § 230 (“Section 230”), the United States law that shields social media companies from being treated in civil claims as the publishers of their users’ posts and that protects the companies’ ability to remove user posts, with the European Union’s (“EU”) equivalent governing law, the E-Commerce Directive. The E-Commerce Directive will be used as an example of a governmental regulation that better prevents viewpoint discrimination, but at the cost of a lower standard of user expression. A lower standard of user expression means diminished rights in exercising free speech, as exemplified by the EU outlawing broader categories of speech than the US (Section III covers this point in detail). This Article will then demonstrate how the US may decrease platforms’ discretionary power over content removal, thereby minimizing viewpoint discrimination against lawful user-posted content, while preserving private governance of social media business practices.

Section II provides background on social media users, platform content regulation, and content removal practices. It continues with a discussion of the enormous amount of content social media platforms are responsible for monitoring and governing. Additionally, the relationships among social media companies, governments, and users are explained in connection with social media content moderation. Lastly, Section II summarizes the First Amendment’s boundaries on the protection of speech and clarifies that it secures US citizens’ freedom of expression only against government actors, leaving private platforms’ content removal practices currently outside the First Amendment’s reach.

Section III lays out the social media content regulation laws governing both the US and the EU. The US’s Section 230 has been referred to as the “26 words that created the internet” due to its thorough protection of private online platforms from third-party (“intermediary”) liability in civil suits such as defamation, and its general allowance for platforms to leave up or take down content voluntarily. By contrast, the EU’s E-Commerce Directive offers platforms safe harbor from legal liability with two main requirements: the platform must (1) not have “actual knowledge of illegal activity,” and (2) “act expeditiously to remove” illegal activity once actual knowledge is obtained. This section concludes by illustrating the EU’s approach to social media content regulation and reviewing its implications for viewpoint discrimination in social platform content moderation.

Section IV discusses the deficiencies of Section 230’s approach to platform content moderation. The analysis will continue with the three main problems arising from Section 230’s current application, which gives social media companies (1) overbroad discretionary authority, (2) the ability to operate with limited transparency, and (3) the ability to discriminate based on viewpoint. Additionally, the Article will explore the implementation of the ground-breaking independent Facebook Oversight Board and its significance in providing an appellate process for wrongful censorship of posts.

Section V proposes two solutions to the three issues with Section 230(c) addressed in this Article. The first solution is statutory revision of Section 230(c)(2), with two proposed revisions: (1) revising the statute to grant immunity to social media platforms only if they remove content that is illegal or otherwise unprotected by the First Amendment, and (2) introducing a “bad faith” clause that removes platform immunity if a plaintiff can prove their lawful post was removed as a result of viewpoint discrimination. The second solution suggests federal statutes mandating that large social media platforms create their own independent oversight boards. These Social Media Oversight Boards will be modeled primarily on Facebook’s Oversight Board and will independently review platform censorship practices through a board review process.
