Facebook and a Biden Presidency

I was recently interviewed by Jason Murdock, a reporter for Newsweek, for his article, What Could Happen to Facebook Under a Joe Biden Presidency, and Should Zuckerberg Be Worried? The article is really timely and explains some of the issues that both President Trump and his political rival, Vice President Joe Biden, have with social media giants and the regulations surrounding them. In this post, I expand on the quotes I provided in that article regarding Section 230.

What, if anything, happens to Facebook if Joe Biden wins the White House?

It’s difficult to answer this question, from both a practical (i.e. policy) perspective and a legal one. Ultimately, the Executive wields relatively little power to “make or break” a company: it is for the Legislative Branch to regulate, and for the Judiciary to adjudicate when matters come before the courts.

Biden has been very outspoken in his dislike for Facebook. Will this translate into real world social media regulation or action, or is it mainly political campaigning?

Putting politics aside, it’s worth focusing the mind on what Biden’s key issues with Facebook are. As explained on his campaign website page dedicated to Facebook (at joebiden.com/facebook), the key themes are principally around political and systemic threats. In other words, Biden is concerned about transparency, misinformation, campaign finance, and political speech.

Although it may be possible to regulate certain aspects of the digital economy and the players which operate within it, Biden’s critiques raise nebulous and incredibly thorny issues. Take, for example, Biden’s call for Facebook to “promote authoritative and trustworthy sources of election information”. This sounds straightforward, but which sources might those be? Perhaps more importantly, who decides which sources are to be deemed “authoritative”?

In the 1940s, the United States Federal Communications Commission (FCC) introduced the Fairness Doctrine, which required holders of broadcast licenses to present controversial issues of public importance in a manner that was honest, equitable, and balanced. By way of analogy, Facebook is in many ways one of the most important broadcasters of our times. However, the Fairness Doctrine was repealed in the late 1980s under President Reagan. Subsequently, the “fairness” or “balance” of current events reporting has become an issue of much debate. This raises the question: even if Biden were to seek enhanced social media regulation, how would it work in practice? This of course assumes that he has the political capital to get such measures passed in the first instance.

Biden’s third point on his website is that Facebook “needs to prevent political candidates and PACs from using paid advertising to spread lies and misinformation — especially within two weeks of election day”. His fourth point is that Facebook “needs clear rules — applied universally, with no exceptions for the President — that prohibit threats and lies about how to participate in the election”.

These points might be somewhat more straightforward to work with – from a legal perspective. The First Amendment typically prevents state actors (i.e. local, State, or Federal government or agencies) from making any laws which attempt to regulate the freedom of speech. Since the 2010 U.S. Supreme Court case of Citizens United, political campaign spending (including Political Action Committee or “PAC” advertising) has been held to be a form of speech. Campaign spending and advertising – even by non-natural persons such as PACs – is thus, broadly speaking, typically protected from government restrictions.

However, the courts have over time applied certain measures to regulate such speech in certain, limited ways. Namely, the government may impose reasonable restrictions on the time, place or manner of constitutionally protected speech which occurs in a public forum. Put differently, it is arguable that a limit on speech within a narrow window of time (e.g. within two weeks of an election) for a very specific purpose (e.g. to prohibit false advertising by a PAC) could be held as not violating the First Amendment. This however is an academic analysis of a hypothetical point.

How likely is it that Biden would, as he has suggested in the past, move to revoke Section 230 protection for social media firms including Facebook?

Section 230 was passed into law as part of the Communications Decency Act of 1996, which is a common name for Title V of the Telecommunications Act of 1996. It was formally codified as Section 230 of the Communications Act of 1934 at 47 U.S.C. § 230.

Although much has happened in the nearly 25 years since it was enacted, s. 230 remains a hugely important aspect of the online ecosystem. This piece of internet regulation effectively offers a form of immunity to social media companies for the content posted by their users. The question addressed here is essentially: “are these tech giants neutral intermediaries, or are they publishers?”

A publisher is generally seen as ultimately having some sort of editorial control, and therefore bears some responsibility for the content on its platforms. Generally speaking, however, tech companies are likely to argue that they are “mere conduits” who simply provide a platform but play no part in controlling the content published. Where this is the case, they argue, they should be shielded from liability.

For context, the legislation comes from the late 1990s and largely sought to provide internet companies with some comfort in the early days of the “information superhighway”. Much has changed since then, and tech companies play a much bigger role in civil discourse today. The Internet of 2004 (the year of Facebook’s founding) and beyond has been driven primarily by user-generated content, in which individuals contribute ideas and engage with each other.

It is easy to see how this issue quickly becomes a question of how to balance freedoms of expression against various other interests of the state. Here I’m thinking about things such as misinformation, hate speech, invasions of privacy, harassment, and so on – all of which society (to varying degrees) will appreciate that the government has a vested interest in regulating.

What’s interesting to note here, however, is that it is not just Biden who takes issue with the security blanket provided by s. 230. Earlier this year, Trump took issue with some of the policies enacted by Twitter (notably their fact-checking procedures which involved the flagging of some of his tweets as ‘misleading’). It is therefore important to underscore that s. 230 has its critics from across the political spectrum.

The President signed an Executive Order to pare back some of the protections, but this has subsequently been challenged in court. That said, Biden has gone so far as to say that he would want the law to be revoked immediately. Doing so would likely invite much uncertainty, as the underpinning social and economic (not to mention legal!) implications and safeguards have yet to be decided. If s. 230 is revoked and Facebook becomes more involved in moderating content on its platforms, what practical changes will users see?

If Facebook is suddenly made responsible for the content its users post, we may see the introduction of stronger take-down procedures. Alternatively, or in addition, the basic relationship that Facebook and other platforms have with their users may change, too. We might see a shift to a subscription model, whereby users are asked to pay fees in order to use social media websites, or otherwise agree to much stronger sets of Terms of Use.

Is Facebook likely to have to change how it operates if Biden wins the presidency?

I think it’s worth considering whether Facebook will “want to change”, in addition to considering whether it will have to change under a Biden presidency. For example, some tech companies were heavily involved in lobbying around the new California Consumer Privacy Act, or have otherwise attempted to introduce some form of self-regulation, ostensibly as a means to placate lawmakers who would seek to revoke s. 230 or otherwise amend the regulatory landscape.