
Podcast: Evaluating Proposals to Amend Section 230, With Aaron Mackey

Aaron Mackey, staff attorney and free speech expert at the Electronic Frontier Foundation, joins Ellysse and Ashley to evaluate recent proposals to amend or repeal Section 230 based on their potential impact and effectiveness.

Mentioned

  • “S.3398 - EARN IT Act of 2020,” Congress.gov.
  • “S.4534 - Online Freedom and Viewpoint Diversity Act,” Congress.gov.
  • “S.4632 - Online Content Policy Modernization Act,” Congress.gov.
  • “S.4066 - PACT Act,” Congress.gov.

Related

  • Ashley Johnson and Daniel Castro, “Proposals to Reform Section 230” (ITIF, February 2021).
  • Sophia Cope, Aaron Mackey, and Andrew Crocker, “The EARN IT Act Violates the Constitution,” EFF, March 31, 2020.
  • Aaron Mackey, “The PACT Act’s Attempt to Help Internet Users Hold Platforms Accountable Will End Up Hurting Online Speakers,” EFF, July 21, 2020.
  • Sophia Cope and Aaron Mackey, “The PACT Act Is Not The Solution To The Problem Of Harmful Online Content,” EFF, July 30, 2020.

Audio Transcript

Aaron Mackey: There’s maybe a high-level agreement among Democrats and Republicans that 230 is some sort of problem, that something should be done, but then what should be done? And they pull in different directions.

Ellysse Dick: Welcome to Ellysse and Ashley Break the Internet, a series where we’re exploring the ins and outs of Section 230, a law that many want to change, but few can agree on how. I’m Ellysse Dick, Research Fellow at the Information Technology and Innovation Foundation. We’re a tech policy think tank based in Washington, D.C.

Ashley Johnson: And I’m Ashley Johnson. I’m a Research Analyst covering Internet policy at ITIF. In this episode, we’ll be examining some of the proposals to repeal, amend, or otherwise change Section 230 of the Communications Decency Act and discuss their potential impact and effectiveness. Joining us, we have Aaron Mackey, staff attorney at the Electronic Frontier Foundation where he works on free speech issues, as well as privacy, government surveillance, and transparency. Welcome to the podcast, Aaron.

Aaron Mackey: Thank you for having me.

Ellysse Dick: So Aaron, to get started, let’s just talk a little bit about the overall regulatory and policy conversation going on in the U.S. right now. Section 230 has been law for over 20 years. Why are we just now seeing a wave of legislation surrounding it?

Aaron Mackey: Yeah. I don’t know if there’s any one single explanation for why, but I think there are a couple of competing conversations happening, particularly among members of Congress. On the Republican side, for the past year or more, you’ve seen a lot of conversations about claims that Section 230 allows for censorship of conservative voices or conservative ideas. With the caveat that there’s not actually been any proof of that, that is the push on one side of the aisle. Then, for longer than that, for the past several years, you’ve heard a lot of voices coming from the Democrats asking how we address harmful content online. And this takes many forms: harassment or threats against individuals, discussions about hate speech, and more recently topics like disinformation and its spread online.

So you have a lot of those conversations, and they ultimately deal with the choices that online services make, how user-generated content is distributed online, and how the platforms that we all rely on moderate that. So they sort of focus on 230 because that’s the law, but I think, as we’ll get to, a lot of the proposals and the problems that they’re trying to address don’t actually start with 230. They usually have bigger background issues at play. But that seems to be the focus right now: everyone seems to want to target 230 as the cure-all for whatever they see as harmful in what’s happening online.

Ellysse Dick: So as far as specific proposals or amendments that are being suggested, do you see any overarching trends in the proposals that we’ve seen? Are you concerned at all about how those specific proposals might impact free speech online?

Aaron Mackey: Yeah, I think almost every proposal that I’ve seen that’s actually been put into a bill does have consequences for free speech online. It can vary, but some of the proposals would do a lot more damage, and they come at it from a perspective that isn’t even grounded in basic First Amendment principles. But I think there are some proposals that start from a different place, a more serious place, that recognize that we’re in a moment online where there are a handful of entrenched large services that we all use—Facebook, Google, YouTube, Twitter—and that the decisions those platforms make regarding our own user-generated content have significant real-world consequences.

So I think starting from that perspective, that there are these few very powerful institutions with a lot of control over how we speak, is the right way to approach this. But I think all of the proposals that deal with 230 quickly run into trouble because they look at changing 230 as a means of addressing those problems rather than taking on the actual problem, which is that we have very little competition, very little ability for consumers and users to actually control their information online, and other things like that.

Ashley Johnson: A lot of critics see Section 230 as sort of a subsidy for the tech industry, but you have argued that Section 230 also benefits users. What do consumers get out of Section 230 and what would they lose if its liability protections were weakened?

Aaron Mackey: So every Internet user benefits from Section 230. If you have your own web page that you’re not hosting on your own home-built server, if you use an Internet email service that is not your own, if you use any service to learn about businesses or to review products and services online, you are a beneficiary of 230. 230 explicitly says that its protections go to both the Internet service provider and any user of an Internet service. So we have these early cases after Section 230 became law in which individuals forwarded defamatory or allegedly defamatory content over email, and courts held that 230 protects those individuals because they weren’t the original creators of that defamatory content.

So I think what gets lost in this is that every entity online that publishes content of a third party is protected by Section 230. That runs from Facebook and YouTube and Twitter all the way down to a local community forum or a niche forum for people who knit. It protects Wikipedia and all the entries there. It protects the Internet Archive, which is sort of our own collective history of the Internet. So there’s just a lot that is protected by 230, and we often miss that in these conversations: users are not only beneficiaries of 230 because we have all these services that we can use every day, but we’re also protected by 230 to the extent we ever operate or use a system where we publish someone else’s content.

Ellysse Dick: So before we get into some specific proposals, in general, what considerations do you think policymakers should prioritize when drafting and evaluating proposals to amend or repeal 230?

Aaron Mackey: Yeah, I think big picture, it has to start with: what is the problem that we’re trying to address, and how do we address that in a way that is consistent with basic constitutional understandings, starting with the First Amendment but, as we’ll talk about with the EARN IT Act, also some Fourth Amendment issues as well. The conversation shouldn’t be narrowly focused on just 230; it should ask, what do we want to accomplish? What specific harm are we trying to avoid? Congress, I think, needs to do the legwork here of developing a record and an investigation, which it has done in the past, to understand what’s actually happening online and to legislate from the basis of facts and not the political posturing that we so often see.

Ashley Johnson: Moving onto the specifics. Unfortunately, if we covered every proposal to amend or repeal Section 230, we’d be here all day.

Aaron Mackey: Yes.

Ashley Johnson: So I’d like to start by highlighting some of the more prominent and recent examples. So first, let’s talk about the EARN IT Act. The bill’s original text would require Internet companies to take reasonable measures to combat child sexual abuse material or they would lose Section 230 protections. There was significant backlash from the tech community and from privacy advocates for how this would weaken encryption. Can you explain how a bill amending Section 230 would have anything to do with encryption and why the original language of the EARN IT Act was so problematic?

Aaron Mackey: Sure. So I think it’s important for everyone to understand that even apart from the EARN IT Act, there’s nothing today in Section 230—and there never has been—that prevents the federal government from initiating and prosecuting a criminal case against online services that knowingly receive or transmit this type of abusive material online, and that this type of material is not protected by the First Amendment, so it’s outside of the Constitution’s protections. So I think it’s just important for people to understand that the government, regardless of what happens to 230, can begin an investigation into these types of crimes today and could have at any point in the past. I think there are questions about resources and emphasis and priorities within the Department of Justice, but I just hope that people understand that first and foremost.

Now, as to what the original EARN IT Act would have done: although its proponents continue to point out that the EARN IT Act does not mention encryption at all in its text, the byproduct would have been to put platforms in an untenable position where they would have had to choose between keeping Section 230’s protections or compromising users’ privacy, likely by breaking encryption. The way this was accomplished in the original text is that the EARN IT Act created a committee that was chaired by the Attorney General and heavily dominated by members of the law enforcement community, and they were going to come up with a set of best practices designed to protect children and prevent the distribution of this abusive material online.

Those best practices would feed into the test for platforms under a new carveout to Section 230. So to get Section 230’s protection, these platforms would have had to comply with those best practices, and had they not complied, in addition to being potentially criminally liable to the federal government, they would have been open to civil suit by any individual, as well as by states’ attorneys general, who could have brought claims on behalf of their residents alleging that the platforms’ policies harmed individuals as a result of the distribution of this material.

So the likely consequence of that commission, it was clear as day, was going to be to require these platforms to stop providing private communications for any sort of content that is posted or shared online. What this meant was that the platforms were going to need some way of seeing into all of our communications, whether that’s email, direct messages on your favorite social media network, or anything like that. So the end result was going to be: either face crushing civil liability or give the government a listening ear into every conversation we have online.

Ashley Johnson: So do you see any lingering problems with the current version of the EARN IT Act, which was amended to pose less of a direct threat to strong encryption?

Aaron Mackey: Yeah. The underlying problem still remains. Rather than assigning the best practices to a federal committee created under the bill, it shifts them to the states. So now, under the EARN IT Act, 50 states can come up with their own civil laws and their own sets of best practices that would potentially allow them to bring claims under state law against providers who haven’t met those best practices. So in addition to providers having to comply with a patchwork of 50 different states’ best practices, the result would be the lowest common denominator: the most intrusive best practice created by any state would effectively apply to every online provider. So the underlying problem still remains that what the EARN IT Act tries to do is, again, to leverage the protections that 230 provides from civil suits into some sort of intrusive practice by the state, here the 50 states, into our individual privacy.

Ashley Johnson: Moving on to another proposal to amend Section 230, the PACT Act, which would, among other provisions, require online platforms to remove illegal content within 24 hours of receiving a notice or lose their Section 230 protections. What potential problems does this notice-and-takedown requirement pose for platforms and their users, and are there any other potential problems that you see with the current text of the PACT Act?

Aaron Mackey: So to start, I think the PACT Act, out of all the bills that I’ve reviewed, is probably the most serious effort: it actually comes to grips with the problem of the dominance of online platforms and recognizes that there is a fundamental imbalance between the average user and the decisions made about your content online. So from that premise, it starts with a very good idea, and it tries to build on a number of other generally good ideas about how platforms should be more accountable and transparent about these decisions. But specifically as to the notice-and-takedown, I think in a simpler world, the idea is noncontroversial: if there’s a court order that says a particular post online is illegal, say it’s defamatory or is otherwise not deserving of protection, there’s a good reason why that shouldn’t be online and why someone should be able to provide notice and have the provider take it down.

There are a couple of problems with that. The first is that it’s very difficult, on the platforms’ end or even publicly, to determine when something is a final order. In the course of a case, you launch your lawsuit with a complaint that alleges the material is illegal; maybe you move for a preliminary injunction, in which there’s an early look at that; there might be individual decisions along the way; but ultimately, a particular piece of content is usually not found to be actionable or otherwise illegal until there’s a jury trial or a final determination. So just looking at the language that was in the PACT Act, it’s really hard for platforms to actually determine what happened in a given case.

The other thing we know happens is that a lot of individuals who don’t like information about them online will try to remove that content, either by creating fake default judgments or by pursuing other means of obtaining an order where they maybe never serve the defendant. We’ve also seen cases in which the defendants are actually working with the plaintiffs to default solely for the purpose of having an order entered that the material be removed. So how does a platform deal with that problem? The PACT Act doesn’t really address that issue. There’s a notice-and-takedown regime in the PACT Act, but there are no protections for the user or the platform if the sort of abuse we’ve just described has happened.

There’s no incentive for the platform under the PACT Act to take any step to stick its neck out and say, “We think that this information is important. We think that it should remain online and we don’t think that it’s illegal.” We know from other notice-and-takedown regimes, particularly in copyright law, that there’s already a strong incentive upon receiving a notice to remove the information. Copyright law has a provision that’s designed to prevent abuse, but it hasn’t really been meaningfully enforced by the courts. So what we see is a regime of over-censorship in the copyright space. To the extent that that same sort of scheme is imported into basically all speech online, we’re going to see a lot more over-censorship, because the platform is going to have absolutely no incentive to stick its neck out for a particular piece of content; instead, the result is going to be that it just removes that information from the Internet.

Ashley Johnson: Great. Well said. And more generally, what are the dangers of removing liability protections or weakening them for platforms when their users post harmful content or allegedly harmful content?

Aaron Mackey: So there are very specific consequences for any individual who is targeted by this. In First Amendment law, we talk a lot about a concept known as the heckler’s veto, which is that one individual is able to exert outsized power and silence others’ lawful speech. Without 230’s protections, that will happen, because a lot of these regimes look at going back to a pre-230 era of enforcing liability, where someone is liable either because they directly published the content, just as a newspaper is liable for what it publishes, or under distributor liability, where if you knew about the specific content or should have known about it, you could be held liable.

So in a notice-and-takedown regime, the incentive is just going to be “notice-and-stay-down,” because there’s basically going to be no incentive for these platforms to litigate every user’s claim that says, “Wait. They just targeted my speech because they don’t like it. It’s actually lawful.” Without 230, why would any platform want to litigate, sometimes through summary judgment, thousands of cases like this? And these decisions occur every single day. More broadly, taking a step back from specific content being targeted, without 230, in addition to platforms being more censorious and willing to remove material that is otherwise legitimate, we’re also not going to see the same growth, development, and innovation in the space for user-generated content.

What we’ve seen, particularly in the U.S., is a history of innovation in the way we all share and communicate online. In the nineties, we had message boards. We had the walled gardens of AOL and Prodigy. Then in the 2000s, we had this explosion of social media sharing different types of content like video. Now, social media dominates our Internet experiences. If we deprive ourselves of 230’s protections, there’s not going to be any new competition or innovation, because the platforms that are now so big can probably afford to throw lawyers and money at these problems, but those who are trying to innovate, compete with them, and offer new products and services won’t be in the same position because they won’t have 230’s protections.

Ashley Johnson: Finally, I want to ask about two very similar bills, the Online Freedom and Viewpoint Diversity Act and the Online Content Policy Modernization Act. Responding to complaints that major tech companies are allegedly biased against conservative viewpoints, both bills would restrict the types of content that online platforms could remove without potentially facing liability. What’s your take on these allegations of censorship and what are the potential free speech implications of these two bills?

Aaron Mackey: I think my take is that censorship of disfavored views is a legitimate problem. We see powerful voices silencing disempowered communities. But to my knowledge, there has not been any showing or evidence that this happens with respect to conservative viewpoints. It often happens to those who are calling for social change, those who are already politically disempowered or who are seen as socially lacking political power. So first and foremost, there’s no evidence to support that. But when it comes to what these bills try to do, specifically, they’re trying to change a protection in 230. Oftentimes, we’re talking about the protection in 230 that prevents an online platform or a user of that platform from being treated as the publisher of someone else’s speech for purposes of liability.

But what they’re trying to change with these bills is a second immunity: 230 also protects a platform’s decision to moderate, to edit or delete a user’s content for whatever reason it chooses, so long as it’s done in good faith. That protects the platform from civil liability from the user whose content was moderated. So what they’re trying to do is frame it as “platforms will now no longer have legal protections to remove any content that is constitutionally protected.” The only material you could actually remove is the material that falls outside the First Amendment: obscenity, defamation, fighting words, direct incitement to violence, and so on.

The problem with that, as a practical matter, is that there’s a lot of content that is protected by the First Amendment but that none of us want to see. That includes spam. That includes bot after bot after bot online trying to sell you a product or scam you into something. If platforms can’t remove that information, then these platforms and services just basically become unusable. Then there are also a lot of things people don’t want to see that are protected speech. A lot of people don’t like to see racially charged language, slurs, hate speech, or threatening language and attacks on individuals that don’t rise to the level of, say, a true threat or harassment that would be unprotected. So these platforms have developed their own policies and views about what type of speech they want to allow, and those policies go far beyond restricting only speech that’s unprotected by the First Amendment.

Now, we can debate whether they generally get that right and whether their policies are meaningful. We can also debate whether they enforce them in a way that’s fair and equitable. Those are legitimate conversations, but I don’t think the way to address that is to say you can’t remove any user-generated content unless it’s unprotected. The other thing that the bill doesn’t answer is: if that protection goes away tomorrow, what is the legal claim against these online platforms for removing this content? Because at the end of the day, we know that these platforms are private entities, they’re protected by the First Amendment, and they have First Amendment rights to determine what types of content they host.

So just as you do not have a legal claim against a newspaper if it doesn’t run your letter to the editor or the op-ed that you’ve submitted, you similarly don’t have a claim against YouTube, Facebook, or Twitter if they decide to remove a specific piece of content or delete your account. Now, that’s separate and apart from the fact that the removal might be very harmful to your own business, your own political organizing, and so on, but the bill doesn’t really answer that question. So what it does do is open the floodgates to a lot of vexatious and frivolous litigation on claims of being censored that don’t actually address the fundamental problem here: a handful of entities being able to decide how we speak, and doing it in ways that are sometimes unfair and arbitrary.

Ellysse Dick: So Aaron, overall, what do you think these proposals get right, and what do they get wrong when it comes to 230 and intermediary liability? Are we just talking in circles or are we working toward a viable solution at this point?

Aaron Mackey: Yeah, it’s interesting. I don’t know if the metaphor is that we’re talking in circles or that we’re pulling in different directions, because like I said at the beginning, there’s maybe a high-level agreement among Democrats and Republicans that 230 is some sort of problem and that something should be done, but then what should be done? They pull in different directions. As we just talked about, with the Online Content Policy Modernization Act, that direction is “don’t let platforms moderate”: they’d basically be legally liable if they moderated any user-generated content that was protected by the First Amendment. So basically, leave more stuff up. Then what you hear from a lot of folks who have been particularly focused in the last four years on disinformation and the spread of harmful information online is that we need to incentivize these platforms to moderate content better, to take more stuff down, to remove things that are constitutionally protected, whether that’s lies spread on the Internet or hate speech and so on.

So when these conversations are occurring, I’m not sure which way they lead, and there’s not really a middle ground when everyone’s pointing at 230 but the solutions would require completely different policy objectives. I will say again, of the bills we’ve seen, the one that gets the most right, I think, is the bipartisan PACT Act, to the extent that it recognizes that there are a handful of large platforms that have outsized power to determine how we speak, share, and organize online. It tries to set out rules specifically designed to deal with these very large platforms, with more flexibility designed to allow smaller platforms and services to compete and not be burdened by the same level of regulation.

But as we discussed, I think the implementation of all that goes wrong. There could be more work done to actually understand how much content there is online, how quickly content moderators have to review this material and complaints about it, how users actually navigate these processes, and things like that. So I’d say of all of them, the PACT Act at least comes from the most honest position of trying to help users and trying to correct the imbalance between the user and these large services. But I think that’s where what these bills get right ends. What they get wrong, obviously, is that they don’t fundamentally grapple with the larger constitutional issues at play. We talked a lot about the First Amendment, but one of the things that’s problematic about the EARN IT Act is that there’s a body of case law under the Fourth Amendment that deals with when a private entity becomes a state actor because it is acting at the direction of law enforcement.

So the EARN IT Act raises the significant likelihood that if online platforms are compelled to look into everyone’s messages and then turn that information over to law enforcement, they’re no longer acting as private actors; they’re actually acting as agents of the government. There’s a lot of case law that deals with scenarios like this: my neighbor asks me to water his garden, I go over and get the hose out of the garage, I notice something illegal and it bothers me, so I call the police. That’s a private search. But what happens if the government tells me that I have to go into my neighbor’s house, look for illegal material, and then report it, on penalty of being subject to liability myself? Then that starts to look a lot more like I’m being directed by the government.

These fundamental issues aren’t insurmountable, but they have to be part of the conversation. If you’re trying to address online content and you’re focused solely on 230, and you’re not talking about the First Amendment, and with EARN IT, you’re not talking about the Fourth Amendment, then I think you’re legislating from a deficient position. You’re not really thinking through how these policies would be effective, including whether your statute actually ends up being enforceable as opposed to being struck down.

Ashley Johnson: For our final question that we will ask all of our guests, we want to know what your verdict is on Section 230. Should we keep it, amend it, or repeal it?

Aaron Mackey: So my verdict is that we should keep it. I don’t ever want to say never, that we would never want to look at it. There are legitimate problems like we’ve discussed that are worth Congress digging into and actually understanding and trying to develop solutions to, but I think those solutions tend to lie outside of amending 230. They deal with competition and antitrust. They deal with consumer privacy and the ability for us to control our information online, and also our ability to take our networks and go to a different service.

So all of those sorts of issues are out there and can be addressed by Congress without amending 230. But ultimately, the reason why I believe in 230 is because, like I said at the beginning, 230 protects every Internet user and has allowed us all to use the fundamental architecture of the Internet. We can build our own website for our own business, our own personal website. We use our email and other social media every day to connect with our friends and our loved ones, and these services are ultimately protected and allowed to create these types of spaces for us to use online.

Ashley Johnson: Very well said. Thanks so much for joining us, Aaron. If you want to hear more from Aaron about Section 230 and all the other tech policy issues he works on, you can follow him on Twitter @aaron_d_mackey.

That's it for this episode. If you liked it, then please be sure to rate us and tell friends and colleagues to subscribe.

Ellysse Dick: You can find the show notes and sign up for our weekly email newsletter on our website, ITIF.org. Be sure to follow us on Twitter, Facebook, and LinkedIn too, @ITIFdc.
