
Podcast: Section 230 and the Fight Against Online Abuse, With Elisa D’Amico

Elisa D’Amico, co-founder of the Cyber Civil Rights Legal Project—which provides pro bono legal assistance to victims of nonconsensual pornography—joins Ellysse and Ashley to explain the unique challenges the Internet poses for combating online abuse, the struggles victims face seeking justice, and the role online platforms play in moderating abusive content.

Mentioned

  • “Stopping Harmful Image Exploitation and Limiting Distribution (SHIELD) Act of 2021,” Office of Congresswoman Jackie Speier.
 

Related

  • Niraj Chokshi, “How to Fight Back Against Revenge Porn,” The New York Times, May 18, 2017.
  • Matthew Goldstein, “Law Firm Founds Project to Fight ‘Revenge Porn,’” The New York Times, January 29, 2015.
  • Pari Berk, “Elisa D’Amico ’06: Fighting to Protect Sexual Privacy Online, in a Share-and-Share-Alike Culture,” Fordham Law News, June 18, 2019.
  • Elisa D’Amico and Luke Steinberger, “Fighting for Online Privacy with Digital Weaponry: Combating Revenge Pornography,” Entertainment, Arts and Sports Law Journal 26, no. 2 (2015): 24–36.
  • Elisa D’Amico, “The War to Stay Secure: Online Privacy and the Battle in the Civil Courts against Sexual Cyberharassment,” Human Rights 41, no. 3 (2016): 5–8.

Audio Transcript

Elisa D’Amico: People tend to think that it is a harm that only takes place online: it happens on social media or it happens on a website and it’s limited to the computer. But that’s not the case at all.

Ellysse Dick: Welcome to Ellysse and Ashley Break the Internet, a series where we’re exploring the ins and outs of Section 230, the law that has drawn attention to both the incredible potential and the real dangers of online communication. I’m Ellysse Dick, Research Fellow at the Information Technology and Innovation Foundation. We are a tech policy think tank based in Washington, D.C.

Ashley Johnson: And I’m Ashley Johnson. I’m a Research Analyst covering Internet policy at ITIF. In this episode, we’ll be looking at the impact Section 230 of the Communications Decency Act has on victims of online abuse and how many individuals misuse online platforms to spread harmful or illegal content to the detriment of others. Joining us, we have Elisa D’Amico, Partner at K&L Gates and co-founder of the Cyber Civil Rights Legal Project, a global pro bono project empowering and helping victims of nonconsensual pornography. Welcome to the podcast, Elisa.

Elisa D’Amico: Thanks so much for having me.

Ellysse Dick: So, we’ve talked a lot on this podcast about the good and beneficial forms of communication that the Internet facilitates. But as we all know, there’s a lot of really terrible content and activity happening online. What are some of the harms that you know take place online?

Elisa D’Amico: Oh, boy, that’s a big question. Well, I’ll start by describing what it is that I do. And basically I feel like I work in the bowels of the Internet, that’s how I think about it. And truthfully, I follow technology, I love to hear about innovation—AI, whatever’s being built—but then in order to do my job, I have to think about the worst thing that could possibly happen from using that technology and then prepare for it. So, the things that I most often see that come across my desk are the sharing of intimate photos and videos that were meant to be private or that were taken without permission. Sexual abuse sometimes is filmed and then used as blackmail, so either it’s shared or threatened to be shared. And then, I mean, there’s tons of material, right?

But all of that is typically accompanied by personal identifying information and/or commentary that is meant to embarrass, shame, humiliate, and harm the individual. And so, it’s not only photos of someone, it’s the sharing of that material in a way that is meant to expose them; that is intentionally and purposefully designed to cause them the maximum amount of harm that it could. So, one of the things that I’ve seen—I mean, it’s pretty disgusting—but women are in some sense traded like baseball cards online, and there are requests for sharing of that material. People will get lauded and applauded for sharing it, and it’s done in a way that is without thought about the people who are pictured.

Ellysse Dick: So, how can this type of online abuse impact victims’ lives, and how might these harms that take place online actually follow victims into the real world?

Elisa D’Amico: That’s a great question. I’m actually glad that you asked it. So, most of the time when we talk about Internet-based abuse or technology-facilitated harassment—whatever you want to call it—people tend to think that it is a harm that only takes place online: it happens on social media or it happens on a website and it’s limited to the computer. But that’s not the case at all. Think about it this way. The first time that you meet someone—you go on a date, you interview them for a job, or meet someone in a crowd, or find someone interesting, a professor, whatever that might be—you Google them, or use Bing, whatever service you like, but usually Google.

And when you put their name into Google, what you usually get is a bunch of information about them, information they’ve shared, whether it’s good or bad. And because of that, this type of material is extraordinarily dangerous to someone’s reputation. And that has a direct impact on their economic wellbeing, their emotional wellbeing, and also their physical wellbeing. So, I’ve had clients who have been fired from jobs, been unable to get jobs, been asked to leave schools, removed from specific honors programs, they’ve been unable to get into schools because of what’s happened to them. And additionally, there are many clients who have physical manifestations of this type of harm. So, for example, gaining a lot of weight in an unhealthy way or losing a lot of weight. Some of my clients have had skin conditions that have manifested as part of the trauma, their hair’s falling out, all different sorts of things.

And then most of the victims of nonconsensual pornography actually suffer from PTSD. And so, this is not just an Internet thing, it is very much something that encompasses someone’s whole life. And as I’m sure you both know, once you put something online, it’s very hard to get back. Things spread very quickly and it’s all about supply and demand. So, if there’s a demand for this material, it tends to spread really quickly and sometimes victims just can’t get away from it. So, it’s very unfortunate. And then in the most unfortunate circumstances, victims have committed suicide, and we’ve seen so much of that; Tyler Clementi, of course, is the first who comes to mind, but there’ve been so many individuals who’ve lost their lives over something that is just so terribly unfortunate.

Ellysse Dick: I think you make a good point about the permanence of the Internet and that being one of the things that makes these online harms so different from what we might see elsewhere in society and around the world, as far as people inflicting harm on others. Are there any other aspects of Internet harm specifically that you think make these unique, in comparison to abuse that you might see in other instances? And are there ways that platforms and victims and other actors respond to these harms that are different from ones we might see in the “real world”?

Elisa D’Amico: So, another great question. I think, in some sense we’ve discussed it, because information just gets spread so rapidly and can go viral—a word we use literally for viruses and for things on the Internet. But because this is an Internet-based crime that we’re focusing on, that means that anyone around the globe who has Internet access can access it. And so the speed at which information travels is incredibly quick and the reach is incredibly broad. When we’re talking about a physical crime, generally it is something that is limited both in scope and in visibility. Before the Internet, if there was a crime, there would be reporting on it in the news, in the media, in a newspaper. Somebody would actually have to wait until the physical newspaper came out to hear about it, and maybe they’d talk with their friends; if it was my grandmother, she’d cut a clip out and mail it to me without any explanation as to what she was doing.

But this is very, very different. It’s a different world that we live in. So, it creates a lot of different problems. One is speed, the other is jurisdictional. So, it’s often really difficult to hold people accountable when those people are not present in your jurisdiction or within your own backyard, within your city, within your state, within your country even. And that presents a big problem for individuals, because if you can’t reach the person on the other side who’s doing this, then you’re really very much at their mercy. And if they don’t stop and if they refuse to stop, victims feel very helpless in what they can do.

And law enforcement plays an important role. Unfortunately—and let me first say, there are many law enforcement agents, many of whom I work with, who do an extraordinary job in helping victims, trying to move the ball forward, and trying to make positive change in how victims are treated from a law enforcement perspective—but overwhelmingly, I can’t find the right words, it’s atrocious how victims are treated. I literally can’t tell you how many times someone has come to me and said, “I tried to report to law enforcement, but they told me I shouldn’t have taken the photos in the first place, or that he took the photos of me so he can do whatever he wants with them. ‘There’s nothing I can do to help you, it’s an Internet thing. It’s already out, there’s nothing we can do.’”

And that’s just not true. It’s not true. It wasn’t true before there were revenge porn laws or nonconsensual pornography laws in place in the states, and it’s not true today. There are things that can be done; there are often, though, problems in getting those investigations started because of this jurisdictional component. So, just to expand on that for a minute: in some jurisdictions—let’s say we’re talking about Miami, where I am—if I’ve got a victim in Miami whose perpetrator is, for example, unknown to her, then I need to convince local law enforcement, if I’m trying to go the criminal route, to open up an investigation and to serve subpoenas—or warrants, as they call them—on, usually, social media platforms or other websites, to try to identify the person who posted this information. And sometimes law enforcement doesn’t have the authority to issue those subpoenas because of how their law is structured. So, that’s one problem.

The other problem—or another problem—is when you have a perpetrator and a victim in two different jurisdictions. So, whether it’s within one state or in two different states, that becomes problematic as well. Imagine a game of pinball; your victim is the pinball. The victim goes to the local precinct and reports, and they say, “Sorry, we can’t help. The perpetrator is somewhere else.” Then the victim calls up the other jurisdiction’s law enforcement: “We can’t help you. You don’t live here.” Or, “We can’t help you; it’s on the phone.” So, she drives all the way there or flies all the way there, and it’s the same thing: “We can’t help you.” “Will you talk to the other law enforcement?” “I don’t know if we can. I don’t know if we can.” And it’s a mess.

And meanwhile, while this is all happening, you have material that is intimate, that is private, that was meant to be private, or in some cases that wasn’t even consensually filmed, sitting on the Internet. And we know that nothing just sits on the Internet; it’s downloaded, it’s shared, it’s reshared, it’s re-saved, it’s sometimes manipulated. And victims are at a loss, and this has been going on for a very long while. That’s actually why I started doing this work: because I knew that there were victims who were literally just sitting ducks, waiting, who had material online and literally didn’t know what to do about it.

Ashley Johnson: So, you mentioned the struggle of holding online abusers accountable. One factor in that struggle is anonymity online, which has been a source of debate for a long time. So, in your mind, what are the pros and cons of letting people remain anonymous online?

Elisa D’Amico: Great question, and a very difficult game of tug-of-war. For one, anonymity is important. People deserve that protection. They deserve the choice to speak out, whether in their own name or in some protected manner. And particularly for marginalized communities, anonymity is critical; we often see that when people are de-anonymized, they’re targeted very heavily and suffer greatly. So, when we think about free speech and the right to remain anonymous, I think that’s incredibly important. If somebody wants to voice their opinion of how they feel about something or speak freely on a topic and be anonymous, I will protect that as much as I can.

The problem comes when we’re talking about activity that’s not protected by the First Amendment: you can’t harass someone, you can’t threaten to kill someone. All of these things are not protected speech. So, when we’re talking about people who are absolutely and unquestionably violating the law, since that’s not protected, I believe that victims have cyber civil rights that we need to protect. And that is, essentially, in some sense, the right to hold people accountable for cyber abuses that they direct against you. And so, often, I will take steps—and practitioners who are helping victims of nonconsensual pornography or other online abuse will take steps—to unmask bad actors online to hold them accountable.

And so, there’s a difficult balance that needs to be struck, because the last thing that someone in my shoes wants to do is to de-anonymize someone where it’s not warranted. But with nonconsensual pornography and online abuse generally, it’s sometimes—not always, but sometimes—very difficult to ascertain what the truth is, because it’s a question of, let’s say, defamation. Without knowing the facts behind what’s happening, we don’t know what’s true and what’s false, what did or did not happen. It’s not as clear as something like CSAM. That’s just completely different; it’s binary. It either is, or it isn’t, and that’s very much not the case here. So, that becomes very difficult, and it’s something that is worth thinking about, especially for practicing lawyers—or any individual—dealing with this situation. Because de-anonymizing can be a very important tool, and the knowledge of how to do that and the forensic capabilities are critical to what I do, but it’s a dangerous weapon in some sense.

Ashley Johnson: So, moving on to the platform side of things. Harms from online abuse can come from a number of places, ranging from websites or services that are actively engaging in illegal or harmful activities to large platforms where this content is mixed in with other largely harmless content. Should we be thinking about and addressing harms from different sources in different ways?

Elisa D’Amico: Well, I certainly think it makes sense to think about the type of intermediary at issue in terms of how we’re dealing with it. For one—and they’ve become, I don’t want to say less popular, but maybe less visible—several years back, there were a ton of what I used to call “revenge porn websites.” This was before we switched to using the term “nonconsensual pornography.” And those sites were designed to spread this type of material, and it’s not even worth giving them airtime. But it was, “Upload your ex and their photos here; give me their name, give me their school, give me their employer, give me their age, give me everything you possibly can—social media handles,” and that type of information gets shared. There’s stuff that’s disgusting that’s very much legal. And then there are all of those sites.

So, those sites, I think, need to be dealt with in a very different way than a social media platform where someone is just choosing to upload this material, whether they have the permission to or not. Now, intermediary liability—obviously, that’s why we’re here. Clearly, it’s mainstream now; more people know about Section 230 than they did five years ago. But I don’t know if people really truly understand what holding platforms accountable for something like nonconsensual pornography would look like in that world. So, I think that we certainly need to look at whether the intermediary is involved in the content in some way, or is specifically soliciting this type of content, the same way those revenge porn websites did.

But all of the platforms and websites have terms of service, and usually community guidelines, and other policies that they’ve put in place that are more limiting than the First Amendment. So, you can’t post pornography. More recently, on some of the platforms, you couldn’t post COVID-19 pandemic gear. So, there are all sorts of things that are obviously way more limited than the First Amendment. The problem with nonconsensual pornography—there are many problems, but one of them—is that it’s not like CSAM. You cannot tell just from looking at it whether it’s actually NCP or not. And in part, that is because of what our society is demanding in terms of pornography.

So, again, I just go back to supply and demand. I’ve got an economics degree, so I tend to go back there a lot. But pornography comes in many shapes and sizes, many different forms. There is a big demand for pornography that looks nonconsensual. And so, because of that, if you’re on the platform side and you’re one of the countless people whose job it is to moderate this content, when you’re faced with that material and you’re looking at it, you can’t tell out of context whether it’s consensual or not. Now, it may not be allowed on the platform because of another policy that disallows pornography, but it’s difficult sometimes, and that’s something that we just can’t get away from.

I think the platforms certainly have changed their policies over time, some better than others, in an attempt to combat this type of material. And I think the pandemic has been a very difficult time for people trying to remove content; since a lot of the takedown procedures were automated, they became a lot slower. I got a notice that an account was taken down quite literally a year after I submitted the request. And one thing that I certainly think could be done better—and I know you didn’t exactly ask this, but it’s worth saying anyway, maybe in the hopes that someone’s finally listening to me—is something I’ve said for a while: I would love for the platforms to team up with practitioners like myself who have been doing this work for a while, and who have a track record of being very credible and knowledgeable in what they’re doing. Because the systems that are set up, with the policies and the procedures within the platforms, are imperfect.

And I have contacts there, and I try to tell them where I see little glitches, where you get stuck in the matrix and something’s not working quite right. And there’ve been times when that’s been fixed, but I would like a program where we can fast-track our requests for removal. And part of what my team is doing—and I’ve got a lot of lawyers working on this in a number of different countries—is the triage. We’re doing that already. We’re doing the work. This is all voluntary; we do it pro bono. K&L Gates allows us to do that and has supported this project since 2014, which is pretty tremendous. But we’re doing all that work, we’re doing that triaging, and I feel like a lot of it is lost, because sometimes it’s just a simple takedown. YouTube has a system—I forget exactly what it’s called, a trusted flaggers system, something like that—and I’d love to see that more widely. I think it would increase efficiencies all the way around, but I haven’t seen it yet. Maybe it’s something to look forward to in the future.

Ellysse Dick: So, can we take a step back for our listeners who might not be working on this every day and just talk a bit about what remedies are available to victims of this kind of online abuse, both in terms of going through the platform process and also potential legal remedies? Can you walk us through what it looks like when something comes across your desk?

Elisa D’Amico: Sure. The things that come across my desk all look very different. For one, I don’t only do pro bono work. I have a litigation practice where I do lots of crisis management and social media law, so I deal with takedowns and other issues like copyright and trademark, and those sorts of disputes as well. So, sometimes I get an inquiry through the pro bono project that actually is outside the scope, but I’m still able to help that person. As for the nonconsensual pornography work, most of the time the victims are women; I think it’s something like 90 percent of all the victims are women. But we see victims of all genders. We see perpetrators of all genders, homosexual and heterosexual relationships, and everything in between.

But typically, and most often, we’re not filing lawsuits on behalf of the victims. Of course we ask what the victim wants; I think that’s a question that sometimes we forget to ask. But most times, and I would say overwhelmingly, if the victim had to choose one remedy, it would be to take the stuff offline. Get that stuff offline. So, victims can usually get material removed from the Internet either on a copyright basis—so the Digital Millennium Copyright Act, the DMCA—or because a platform or website’s policies and protocols have been violated. Sometimes we’ve been able to get the material down because the person who’s working at the website just listens.

Other times, in cases where sexual abuse has been filmed or photographed, there’s not necessarily a specific policy, but it’s usually pulled down pretty quickly. And in other cases, law enforcement is able to enforce the takedowns, and that’s wonderful. It’s very important that we do that so that we can limit the harm to the victim. Sometimes, though, we’re not able to get the material down: either it’s on the dark web, or it’s hosted somewhere where the individual or entity is not responding to a takedown request and the hosting provider is not responding, and it’s just out of our reach.

And in those cases, what we do—and I think it’s a forgotten tool—is de-index from search engines. Both Microsoft and Google came out with policies that said, “Nonconsensual pornography is not permitted.” Essentially, “If you are a victim of nonconsensual pornography, report it to us and we’ll remove that search result from Google.” So, there is a form to do it. Victims can do it themselves. I will admit, it is not the most intuitive; it’s slightly confusing for some reason because of the wording, but that is one option. And the way that I like to describe it is: we’re on one side of a bridge and the material is sitting on a website on the other side of the bridge. In most cases—unless you know that exact URL, that exact address where the offending content resides, and you want to go that way, or you’re searching on another site for a particular person—you get to that other side by crossing that bridge: by going through Google, by clicking on a search result.

So, if we’re able to remove that search result, then when somebody Googles the victim’s name, it doesn’t come up—and people often aren’t looking for it beyond that. So, that is a very, very useful tool, perhaps underutilized or not thought of as much, but it’s something that we use quite often. For victims who live in the EU, there is the right to be forgotten: they can request that results about them come down from search engines, whether it’s revenge pornography or not. But that’s only for the EU results, not for the global results—and in the United States, we don’t have that at all. So, these policies are very, very important, I think, for victims as well.

Ashley Johnson: So, turning now to the Section 230 angle. Many of the critiques of Section 230 stem from the presence of illegal or harmful third-party content on online platforms. What is your take on critics’ claims that there currently isn’t enough of an incentive for platforms to moderate this type of content? Is Section 230 the problem, or are other laws and practices potentially the real problem when it comes to nonconsensual pornography?

Elisa D’Amico: In terms of social media platforms, I don’t think Section 230 is the answer per se. I think that, for example, the SHIELD Act—which is now going to the Senate—looks at whether an intermediary is specifically soliciting this type of content and knowingly and predominantly distributing content that they know violates that section of the SHIELD Act. That is very, very specialized. And I think that the SHIELD Act would be very useful, in part, to solve the jurisdictional problem that I was talking about a little bit earlier, which is: when we’re in different states or different jurisdictions, it’s very difficult to hold the individuals who are actually doing the harm responsible, but the platforms are the ones that often have the information that we can use to solve our problems, to figure out who’s doing it.

Throughout my pro bono practice, we’ve actually had to subpoena the platforms over and over again to get that information. For the most part, they’ve been cooperative. In some cases—I know it’s shocking—the platforms have provided me that information without a subpoena, because the situation was just so egregious. They’ve cooperated with law enforcement. And so, I think that the way to help this problem with nonconsensual pornography is to train law enforcement a heck of a lot better, and to train and incentivize private practice lawyers to take these cases and try to hold people responsible. And I think society as a whole needs to change in order for this to get better.

Because, again, supply and demand. For whatever reason—I still can’t wrap my brain around it; I’d love it if you guys could explain it to me—there are people who live for this. They do it for the lulz. They would love nothing more than to humiliate and harm people that they don’t even know. So, I don’t believe that holding the platforms responsible is the answer. Now, I think there are many cases where the platforms could do a lot better. For example, the program that I suggested would, I think, be a great help to everyone; I don’t really see a downside to that. And listening to the people who are reporting content and reporting problems.

The issue with a lot of the negative content and the, I guess, malicious content that’s online now is that it’s not just nonconsensual pornography. We’re talking a lot of times about defamation. And in some cases—or most of those cases—the platform certainly isn’t going to sit down and decide whether the person who posted the content or the person who’s complaining about the content is telling the truth. That’s a bad place to put the platform in. So, this idea of holding the platforms liable for that content doesn’t sit well with me. I would much rather use those efforts to try to educate law enforcement and educate lawyers and educate individuals, in the hopes that we can change societal norms toward something that I think we’re all aspiring to.

Ashley Johnson: For our final question, we want to know what your verdict is on Section 230. Should we keep it, amend it, or repeal it?

Elisa D’Amico: Let me answer that question by saying this: I think that the fact that the House has passed the SHIELD Act is a great step forward for preventing the type of harms that we’ve been talking about today, and I am hopeful that the Senate will do the right thing. I think that intermediary liability is obviously a very contentious topic, but I think intermediaries are part of the answer in fixing what’s broken with the Internet—except for you two, because you two “broke the Internet,” right? But I think that this is a good step forward, and I am really hopeful about how the future looks.

Ashley Johnson: Thanks so much for joining us, Elisa. If you want to hear more from Elisa, you can follow her on Twitter @elisadamico. That’s it for this episode. If you liked it, please be sure to rate us and tell your friends and colleagues to subscribe.

Ellysse Dick: You can find the show notes and sign up for a weekly email newsletter on our website, itif.org. Be sure to follow us on Twitter, Facebook, and LinkedIn too, @ITIFdc.
