Facebook was right

Last week, Gather the Jews asked its readers whether Facebook was right or wrong to take down a page calling for a “Third Intifada” against Israel. In this post, Samara Greenberg argues that Facebook made the right decision by taking the page down; Yaakov Roth offers a competing perspective in a separate post.

Samara Greenberg is a research associate at the Jewish Policy Center and assistant editor of inFocus Quarterly.

----

Samara Greenberg

One month after it appeared, the Facebook page calling for a Palestinian intifada was removed after the social network faced widespread condemnation. In a letter of explanation, Facebook stated, “we do not typically take down content that speaks out against countries, religions, political entities, or ideas. However…when they [Facebook pages] degrade to direct calls for violence or expressions of hate – as occurred in this case – we have and will continue to take them down.”

The page, titled “Third Palestinian Intifada,” called for an intifada to take place on May 15, 2011—the day Palestinians mourn Israel’s independence. The organizers called on Palestinians to attack Israeli settlements, stating, “Judgment Day will be brought upon us only once the Muslims have killed all of the Jews.” The page had more than 340,000 fans.

While some may disagree as to whether Facebook should have removed the page, it undoubtedly had the legal right to do so. Each Facebook user agrees to the company’s terms of service when creating an account, thereby consenting to the removal of content that violates those terms, including content that “is hateful, threatening, or…incites violence.” Not only did the “Third Palestinian Intifada” page “by its very title incite[] violence,” but its users and administrators also called for violence against Israelis.

But what of the users’ free speech rights? Courts have consistently held that the First Amendment “does not protect against actions taken by private entities, rather it is a guarantee only against abridgment by government, state or federal.” See Noah v. AOL Time Warner, 261 F. Supp. 2d 532 (E.D. Va. 2003); Green v. America Online, 318 F.3d 465 (3d Cir. 2003). In addition, courts have found that online service providers enjoy First Amendment rights of their own, namely freedom of the press. See Langdon v. Google, 474 F. Supp. 2d 622 (D. Del. 2007). Finally, Section 230 of the Communications Decency Act immunizes providers of interactive computer services, a category that includes Facebook, from liability for “any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected.”

With Facebook’s legal rights established, the only remaining question is whether there are compelling arguments for requiring Facebook, through regulation, to honor its users’ free speech claims. Answering that question requires weighing the interests of the parties involved against the costs of regulation, in pursuit of a more equitable balance of “internet rights” than the one produced naturally by free market forces. A brief look at one of the most controversial attempts to regulate content providers illustrates that such endeavors generally prove futile, if not toxic.

Between 1949 and 1987, the Federal Communications Commission (FCC) enforced a policy known as the “Fairness Doctrine,” which required broadcast stations to cover controversial issues of public importance and to air contrasting viewpoints on those issues. Stations were monitored, and those that failed to comply risked losing their broadcasting licenses. To avoid the added cost and risk of finding opposing viewpoints, some journalists avoided covering “controversial” issues altogether, the exact opposite of the doctrine’s intent. Another indirect consequence of the doctrine is illustrated by the broadcast media boom that followed its repeal: the number of stations broadcasting talk radio programs nationwide increased from, at most, a few hundred in 1980 to more than 1,500 today. Simply put, government regulations increase the cost of doing business, burden existing companies, and dissuade potential start-ups from entering the market.

Legally, Facebook has the right to remove user content that violates its terms of service. Lawmakers or courts could one day subject internet platforms to content regulation, but such rules would bring unintended consequences, stifling business growth and, ironically, speech itself. Whether Facebook should have taken down the intifada page is a separate, more subjective question. As a social network, the company must constantly balance keeping an open atmosphere with maintaining a safe one. This time, Facebook made the right decision. A page that calls for violence against any person or group can lead to harm, even death, and should be removed.