Political heat is swirling around the Catch-22 that grants platforms unique legal immunities on content

May 2021

Orr was crazy. That meant he didn’t need to fly any more US combat missions in World War II. The rules said all Orr needed to do to be grounded was ask. But, as Doc Daneeka explained to Yossarian, there was a catch: anyone who wanted to get out of combat wasn’t crazy. “That’s some catch, that Catch-22,” Yossarian observed.[1] With this scene in his 1961 bestseller of the same name, US author Joseph Heller coined a term for dilemmas and absurdities arising from conflicting circumstances.

Historian Niall Ferguson describes ‘CDA 230’ as the most important Catch-22 of the internet age.[2] The letters and numbers are shorthand for the law that, thanks to conflicting definitions of the status of platforms, grants social-media networks such as Facebook legal immunity for the content they host and the latitude to decide what content they won’t host.

The Catch-22 came about when US policymakers decided internet-service providers needed protection after a court in 1995 ruled an internet company was liable when a user defamed a securities firm on its message board.[3] The resulting section 230 of the Communications Decency Act of 1996 cleared the way for the rise of platforms, including Facebook and YouTube, dedicated to user-generated content.[4]

In its most significant part, CDA 230 instructs that platforms are not publishers. This largely absolves them of liability for harm caused by content posted or shared by users, whereas traditional publishers are responsible for the content they and their users publish offline and online.[5] The other key part of CDA 230 treats platforms as publishers.[6] This gives them the right to censor content and bar people they find objectionable.

The greatest threat to the ever-more-influential platforms is Washington repealing the immunity for content. Platforms would then need to vet every post they host to ensure it complied with the law and was not defamatory – Facebook, for instance, would have more than three billion users to oversee. Abolishing the ability to remove objectionable content, by contrast, wouldn’t maul the viability of platforms, just encroach on their rights as private entities.

The threat to the CDA 230 immunities ballooned after supporters of former president Donald Trump stormed the Capitol in January to dispute the validity of the 2020 election. Apple, Alphabet (owner of Google), Amazon, Facebook, Twitter and others highlighted their quasi-censorship powers under CDA 230 when they banned Trump for “fanning the flames”[7] and disabled some conservative sites such as Parler. At the same time, the incident highlighted their legal immunity for allowing the sharing of misleading information about the election and for hosting groups such as ‘Stop the Steal 2020’.

By the end of April, the number of proposals to amend CDA 230 circulating in Washington stood at about 40.[8] A bill co-sponsored by Democrat Senator Mark Warner, for example, would make it easier for people to seek legal redress if content abuses, discriminates against, harasses or threatens them with physical harm and the platforms take no action. “How can we continue to give this get-out-of-jail card to these platforms?” Warner asks.[9]

The senator would know the answer. When it comes to restricting content, authorities are stymied by free-speech protections. Policymakers grasp that limiting the protections would maim platforms such as Wikipedia that are not accused of harm. They know it would make it prohibitively expensive for platforms to manage the user ratings and reviews valued on services such as AllMusic, Amazon and Tripadvisor. Lawmakers’ calls to abolish the online anonymity that shields trolls also run up against free-speech protections against the “tyranny of the majority”.[10]

The quandary officials need to solve when it comes to online platforms removing any content they find objectionable is that cyberspace is the modern public square.[11] Platforms determine the line at which free speech crosses into unacceptable speech – a line that is often arbitrary. Yet government cannot intervene when Silicon Valley blocks views it deems ‘hate speech’, even when those views are protected free speech offline and acceptable to many people.

Two other concerns snooker policymakers. One is that Washington needs Big Platforms to beat back China’s drive to dominate the technologies of tomorrow. The other is that democratically elected politicians are reluctant to sabotage platforms that voters view as essential and harmless to use, and that businesses regard as valuable advertising tools. The likely outcome? Platforms will retain legal immunities not available to others.

The CDA 230 content immunity, of course, is not absolute. Platforms are liable for content they create. They must not abet crime. More laws have appeared that force platforms to remove user content once they have been informed of its illegality. Platforms need to obey content-related laws such as copyright law. Traditional media and publishers enable plenty of mischief, for all their legal responsibility. Lawyers argue over the constitutionality of CDA 230, so maybe one day a court, rather than politicians, will torpedo it.[12] It’s true too that CDA 230 has never been more vulnerable because the conservative side of US politics reckons that Big Platforms unfairly silence it, while the liberal side thinks the platforms are not doing enough to remove what it judges to be hate speech.

Even though most politicians are unhappy with the lack of accountability stemming from CDA 230, the left-right disagreement on whether to prioritise fighting fake news or protecting free speech is another impediment to action because it makes consensus harder to build. The CDA 230 protection will stay. As will the controversies swirling around the Catch-22.

Owning up

Personal responsibility has been the cardinal principle of ethics since Aristotle founded the systematic study of morals in ancient times. The concept that people are responsible for the consequences of their behaviour is a cornerstone of legal systems. There are exceptions: ‘diminished responsibility’ is an accepted defence in criminal cases for people under duress, the mentally handicapped and the insane. But for most others (allowing for exceptions such as diplomats), individuals and companies are responsible for the actions they take.

In this age when social media is so popular and influential, it galls many that no one seems to be responsible for the misinformation that often spreads rapidly and widely in cyberspace. Some have blamed the social networks for intentionally and cynically ignoring misinformation on their platforms, suggesting that higher engagement enables them to sell more ads, a claim that the social networks reject. In any case, viral misinformation has been blamed for everything from genocide to youth suicides, and from encouraging anti-vaxxers to fostering the polarisation that the authors of How democracies die say “challenges US democracy”.[13]

Some say the solution is ending CDA 230, come what may to the platforms. Shunning this path, politicians have encouraged the companies to better regulate themselves (which they have done) while tightening exemptions to the CDA 230 immunity for certain content. In 2018, for example, US lawmakers removed CDA 230 protection for user content that aids sex trafficking, a change the platforms initially opposed due to the existential threat that overturning CDA 230 poses.[14]

Some of the proposals circulating in Washington nowadays amount to a push to carve out more exceptions to the CDA 230 immunity. Warner’s bill is one example, though it seeks only to remove CDA 230 protections when paid content abuses or seeks to defraud people.

Other proposals seek to make the immunities conditional on online companies fulfilling certain requirements, generally along the lines that platforms must report and remove criminal activity. Facebook CEO Mark Zuckerberg appears to back such measures as part of “thoughtful reform” of section 230. “We believe Congress should consider making platforms’ intermediary liability protection for certain types of unlawful content conditional on companies’ ability to meet best practices to combat the spread of this content,” he said in March in a submission to Congress before a hearing on the events of January 6.[15]

Some US lawmakers say outlawing anonymity might be another way to improve content on the internet.[16] But this may lead nowhere because courts have previously ruled the US constitution supports anonymity as an “honourable tradition of advocacy and dissent”.[17]

All in all, while there will be incremental reform on advertising transparency and on liability for failing to remove banned material (such as terrorist content and child pornography), no major limitations are likely to be placed on the content immunity platforms enjoy.

Private control

In 2017, the day after the deadly ‘far-right rally’ at Charlottesville, Matthew Prince, the CEO of US-based internet-infrastructure company Cloudflare, woke up “in a bad mood”. In a memo to staff, Prince recounted that, on the rationale that “the people behind the Daily Stormer are assholes”, he terminated Cloudflare’s services to the white-supremacist website, as its terms of service allow. “No one should have that power,” he admitted.[18]

But most privately owned businesses have always had the power to withdraw their services. (Telecoms and utilities, by contrast, can’t discriminate under common-carrier laws.) It’s just that the right of private entities to control their services matters more on the internet because a few private companies control cyberspace. A smattering of CEOs can thus cancel anyone or censor anything. This power became apparent when, after the Capitol was stormed, Big Platforms blackballed the sitting (even if outgoing) US president and crippled internet communications for many of his supporters by disabling the micro-blogging site Parler.[19] Each platform justified Trump’s suspension by citing his role in inciting the violence at the Capitol and the potential for further trouble ahead of the transfer of power to President Joe Biden. Facebook’s Oversight Board, a committee appointed by the company to provide recommendations on content, in May upheld the ban on Trump.[20]

The decisions to electronically silence Trump have startled many from all sides of US politics. The progressive American Civil Liberties Union warned of Big Tech’s “unchecked power to remove people from platforms that have become indispensable for the speech of billions”.[21]

While platforms have the legal right to ‘deplatform’, the bans added to their political liability because Republicans and their allies now eye them as opponents. But there’s little even Republicans back in power could do to ensure their presence on social media or to limit the platforms’ power over content decisions.

After all, they couldn’t do much while in power. In 2020, when Twitter attached fact-check warnings to Trump’s tweets, he could only respond with a hollow executive order about “preventing online censorship” while calling for CDA 230 to be revoked.[22] The then-Republican-controlled Senate couldn’t do more than stage the political gesture of subpoenaing Big Platform CEOs to appear before the chamber.

Reducing the power of platforms to make unilateral content decisions may be impossible because free-speech legal protections such as the First Amendment in the US are aimed at limiting the reach of governments. When it comes to private entities or individuals, these protections stop the government from forcing them to associate with speech they oppose.

Congress can thus no more oblige Twitter to host the US president du jour than it can compel The Washington Post to publish White House media releases. Who wouldn’t think that would be crazy?

By Michael Collins, Investment Specialist

[1] Joseph Heller. ‘Catch-22’. Originally published by Jonathan Cape. This exchange is on page 54 of the Corgi edition.

[2] Niall Ferguson. ‘Free speech is in free fall in Silicon Valley.’ 9 June 2019. niallferguson.com/journalism/miscellany/free-speech-is-in-free-fall-in-silicon-valley. See also, Niall Ferguson. ‘The tech supremacy: Silicon Valley can no longer conceal its power.’ The Spectator. 16 January 2021. spectator.com.au/2021/01/control-halt-delete/

[3] On the Harvard website, see Stratton Oakmont versus Prodigy Services. Ruling on 24 May 1995. h2o.law.harvard.edu/cases/4540

[4] The immunities have by default applied across the US-controlled internet even though no other country has passed similar laws. See Electronic Frontier Foundation, a lobby group for CDA 230. ‘CDA 230: The most important law protecting internet speech.’ eff.org/issues/cda230

[5] The wording is that no interactive computer service will be “treated as the publisher or speaker of any information provided by another information content provider”. This means they are classed like communication utilities that aren’t responsible for how people use their services. See US Congressional Research Service. ‘Liability for content hosts: An overview of the Communication Decency Act’s Section 230.’ 6 June 2019. fas.org/sgp/crs/misc/LSB10306.pdf

[6] The wording is that no platform can be “held liable on account of ... any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable”. This means they are unlike communication utilities that can’t ban anyone or interfere with communications. See US Congressional Research Service. Op cit.

[7] Reuters. ‘Trump returns to Twitter as Facebook’s Zuckerberg bans him for ‘fanning the flames’’. 8 January 2021. reuters.com/article/usa-election-socialmedia/update-6-facebook-blocks-trump-zuckerberg-calls-unrest-an-insurrection-idUKL4N2JI39L?edition-redirect=uk

[8] Project Disco. ‘The telecommunications act’s ‘Good Samaritan’ protection: Section 230.’ Count was at 8 February 2021. project-disco.org/section-230/

[9] The Washington Post. ‘Sen. Warner to unveil bill reining in Section 230, seeking to help users fight back against real-world harm.’ 6 February 2021.  washingtonpost.com/technology/2021/02/05/senate-warner-section-230-reform/

[10] See US Supreme Court ruling. ‘McIntyre v. Ohio Elections Comm’n, 514 US 334 (1995). supreme.justia.com/cases/federal/us/514/334/

[11] This term described any location where people could freely discuss issues under government protection.

[12] See Philip Hamburger, professor at Columbia Law School and president of the New Civil Liberties Alliance. ‘The constitution can crack section 230.’ 29 January 2021. wsj.com/articles/the-constitution-can-crack-section-230-11611946851

[13] Steven Levitsky and Daniel Ziblatt. ‘How democracies die.’ 2018. Penguin Books 2019 paperback. Page 204.

[14] One act was the Allow States and Victims to Fight Online Sex Trafficking Act of 2017, commonly known as FOSTA. (The changes mean the platforms can’t use CDA 230 as a defence if user content aids sex trafficking.)

[15] ‘Testimony of Mark Zuckerberg Facebook Inc.’ Hearing before the US House of Representatives Committee on Energy and Commerce. Subcommittees on Consumer Protection & Commerce and Communications & Technology. 25 March 2021. docs.house.gov/meetings/IF/IF16/20210325/111407/HHRG-117-IF16-Wstate-ZuckerbergM-20210325-U1.pdf

[16] Such calls were re-aired after the Capitol was stormed. See Andy Kessler. ‘Online speech wars are here to stay.’ The Wall Street Journal. 24 January 2021. wsj.com/articles/online-speech-wars-are-here-to-stay-11611526491?mod=MorningEditorialReport&mod=djemMER_h

[17] See Techdirt. ‘No, getting rid of anonymity will not fix social media; it will cause more problems.’ 1 February 2021.  techdirt.com/articles/20210131/01114246154/no-getting-rid-anonymity-will-not-fix-social-media-it-will-cause-more-problems.shtml

[18] Daily Stormer. ‘Matthew Prince of Cloudflare admits he killed the internet because he thinks Andrew Anglin is an asshole.’ 17 August 2017. dailystormer.su/matthew-prince-of-cloudflare-admits-he-killed-the-internet-because-he-thinks-andrew-anglin-is-an-asshole/

[19] Practical side effects of the decision include that it forces users onto the ‘dark web’ and beyond the easy monitoring of law enforcement.

[20] Facebook Newsroom. ‘Oversight Board upholds Facebook’s decision to suspend Donald Trump’s accounts.’ 5 May 2021. fb.com/news/2021/05/facebook-oversight-board-decision-trump/. Here’s what former House Speaker Republican Newt Gingrich thought of the decision. ‘The Facebook Oligarchs’ betrayal of America.’ Newsweek. 8 May 2021. newsweek.com/facebook-oligarchs-betrayal-america-opinion-1589706

[21] Twitter. Tweet by Kim Zetter quoting the American Civil Liberties Union. 9 January 2021. twitter.com/kimzetter/status/1347733935661305858?lang=en. The angst extended beyond the US. German Chancellor Angela Merkel described the decision as “problematic”.

[22] US government’s Federal Register. Executive order 13925 decreed by President Donald Trump. ‘Preventing online censorship.’ 28 May 2020. federalregister.gov/documents/2020/06/02/2020-12030/preventing-online-censorship
