5 Things You Need to Know About Section 230

1.png

We’re breaking down what you need to know about Section 230..

  1. We've been protecting content distributors from liability WAY before there was an internet.

  2. Section 230 functions as both a sword and shield and neutrality is not required.

  3. Changes to Section 230 would never be easy but the size of the tech giants now make any reform even more difficult.

  4. Europe is working on their own framework.

  5. Regulating the internet is difficult and Section 230 is not our only tool.

Thank you for being a part of our community! We couldn't do what we do without you. To become a financial supporter of the show, please visit our Patreon page, subscribe to our Premium content on Apple Podcasts Subscriptions, purchase a copy of our book, I Think You're Wrong (But I'm Listening), or share the word about our work in your own circles. Follow us on Instagram, Twitter, and Facebook for our real time reactions to breaking news, GIF news threads, and personal content. To purchase Pantsuit Politics merchandise, check out our TeePublic store and our branded tumblers available in partnership with Stealth Steel Designs. To read along with us, join our Extra Credit Book Club subscription. You can find information and links for all our sponsors on our website.

Episode Resources

Transcript

Beth [00:00:00] I think everybody knows that there is a problem here. It's just hard to figure out how to fix it, and the First Amendment is a really difficult piece of this discussion. 

Sarah [00:00:16] This is Sarah. 

Beth [00:00:17] And Beth. 

Sarah [00:00:18] You're listening to Pantsuit Politics. 

Beth [00:00:20] The home of grace-filled political conversations. 

Beth [00:00:43] Hello, everyone, thank you so much for joining us for another episode of Pantsuit Politics. Today, we're going to do a five things episode because as we've talked about before, we really prioritize curiosity here and there's a lot of discussion on and off about what's happening on social media and how social media should be regulated. And at the center of this discussion is Section 230 of the Communications Decency Act. Now we know that sounds really exciting, but we're going to have a conversation about it today that hits on some of the most important issues that we think about as citizens, as parents, as people who love people who have dropped their basket on social media. And so we hope that these five things will help you as you are processing all of this discussion. Before we get started. If this episode is valuable to you and we hope it is, we would love for you to say so in the Apple Podcasts Player. Just go through and leave a quick review for us. It helps other people find Pantsuit Politics, and we are very grateful. 

Sarah [00:01:50] So the first thing we want you to know about Section 230 is it's not new. We have been protecting content distributors from liability way before there was ever an internet. In 1959, the Supreme Court ruled in Smith v. California that a city ordinance forbidding the possession of obscene material didn't apply to a bookstore. The bookstore was a content distributor, not a publisher, a content distributor, and the court worried that if bookstores or other content distributors had to be responsible for all the content they distributed, liability wise, that would restrict access in a way that would restrict all speech that it would have that chilling effect. 

Beth [00:02:32] And that is the central debate about people who have sites on the internet. What are those sites and what are those responsibilities? When we're talking about Section 230 we're talking about what can Facebook, Twitter, any other site on social media be sued for? So on the internet, we have these service providers who are distributing content just like that bookstore. Some of them are acting neutral. Whatever comes in goes out. And some of them, like early service provider Prodigy, employed a team of content moderators. When they got sued, the court said, Well, you're doing some moderation. And so that makes you a publisher, which means that you are on the hook legally for what you have out there in the world. 

Sarah [00:03:18] And so then we're up against what they were talking about with the bookstores. OK, well, now that you've asked them to moderate, which is in theory, something we want people to do to maintain decency, you're going to hear the word decency in the act itself. And then we're going to talk about what happens to those provisions later. But there was a lot of concern. OK, well, now we're punishing providers for what we wanted them to do and which the federal government can't do under the First Amendment. The First Amendment protects speech period. So anything that the court's going to see the federal government regulating that could have effect on speech, they're going to strike it down. They're going to say you're regulating speech and you can't do that. So in the Communications Decency Act right there in the name they started by trying to criminalize the known transmission of obscene or indecent messages to any recipient under 18. But again, because of the First Amendment, the Supreme Court struck down the anti indecency provisions, and it left only the 26 words that made the internet. That's what people call it. Section 230. No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider. 

Beth [00:04:27] So the result of this is the second thing we want you to know. Section 230 is both a sword and a shield, and neutrality is not required. So if I'm a platform, I am shielded from most liability. Most people cannot take me into court and sue me for what someone else has published through my site, and I get to moderate content. I get that sword. I do not have to be neutral and ensure that there are exactly as many Republican posts as Democratic posts, for example, on my site. 

Sarah [00:05:01] This lack of neutrality, this lack of a requirement of neutrality, is at the center point of many conservative criticisms. It's what we heard when the platforms banned Donald Trump, Ted Cruz, Josh Hawley. Now, some of them, I would argue in bad faith, were deliberately misreading the law and arguing that the law requires neutrality. It does not. It does not require neutrality or else it wouldn't have the sword or else would be back in that prodigy position that the second somebody moderated content, they'd lose liability. Well they've been moderating content for nigh on decades since the passage of Section 230. So it definitely doesn't require neutrality. But what they're arguing, some of them, is that it should be. That in order to receive that immunity, that we should require content neutrality. That if they moderate, they do need to make sure that conservatives aren't being punished. That's obviously the concern of the conservative critics. And so what many conservative reforms for Section 230 are proposing, or is either a repeal or a requirement of neutrality in order to receive that liability protection? So what's interesting is that the legal challenges to Section 230 after it was passed have come not from what is required of providers, not this question of like should they have to be neutral, but who gets the protection to begin with, who is a provider. So for the first decade or so, providers were granted near complete immunity. Everybody would say if you came to court and you said, I'm a provider, I deserve immunity. The courts are like, Cool, you got it. Then in 2008, you have Fair Housing Council of San Fernando Valley v. Roommates.com and the Ninth Circuit ruled that since Roommates.com required a profile, had a profile system, that it was an information content provider and now a publisher, and therefore did not receive the immunity of a provider. So that's interesting. There's lots of sites that have a profile system. I can think of a couple and then we start to see that chipping away. We see lots more cases where sites came and said, I'm a service provider, I get immunity. And the court said, No, no, you don't. You're not a service provider. And so that's where we start to see changes to Section 230. And then we have a big change come from Congress itself. 

Beth [00:07:21] Because Congress continues to worry about the original sins of the internet around sex trafficking, child pornography, etc. So Congress passes,  

Sarah [00:07:32] Rightly so. Yes, rightly so. That's a good concern.

Beth [00:07:34] That is an excellent concern for Congress to worry about. So Congress passes the allow states and victims to fight Online Sex Trafficking Act and the Stop Enabling Sex Traffickers Act. These two acts together remove that immunity that Section 230 provides so immunity again, meaning you can't be sued for services that knowingly facilitate or support sex trafficking. So this law was signed into law by President Trump in 2018, and it has been criticized by both sex workers and free internet advocates for pushing this content further into the dark corners of the web and for eroding Section 230. People are worried that if you start chipping away at this immunity, it's not going to be long before everyone is in court for everything. The Internet Association was initially opposed to these acts, but then came out in support right after the tech giants testified about misinformation in the 2016 election. 

Sarah [00:08:34] Do we think that's a coincidence? 

Beth [00:08:37] Probably not. 

Sarah [00:08:38] Yeah. 

Sarah [00:08:39] And then you have criticism from the other end of the political spectrum that this shield was supposed to empower them to use the sword more aggressively and that basically you're not using the sword enough. That the social media platforms in particular have not done enough to combat hate speech, have not done enough to combat misinformation. They were getting grilled really hard in those hearings for their role in the 2016 election. And so you have conservatives, including former President Trump, calling for a repeal of Section 230. And then you had President Biden during the 2020 election calling for a repeal of Section 230 during his campaign. And you have other critics on the left as well. 

Beth [00:09:16] So this is a tricky place that we land where people of both parties don't like this law, but they cannot agree on what to do next. And this is the third thing that we want you to know. Fixing this is not easy. Changes to Section 230 would be difficult in any environment, but as tech companies get bigger and bigger, that makes any reform even harder. 

Sarah [00:09:39] So anything, as we previously mentioned, with a whiff of regulating speech, is going to get struck down by the court. Look, this is the hard part. Misinformation is constitutionally protected speech. I know nobody wants to hear that. But it is the reality. Free speech is free speech. Now, if we want to have a conversation about that, I suggest we have a conversation about the First Amendment because changing Section 230 only to get it struck down by the Supreme Court as regulating speech is not going to get us very far. Plus, you have this issue of any increased regulation if it makes it past the courts really runs the risk of just entrenching these already big players because guess who's going to have the money and resources to comply in a complex regulatory environment and to fight lawsuits? 

Beth [00:10:26] That would be Facebook. Mm-Hmm. Mm-Hmm. That's why you see Facebook ads a lot supporting meaningful internet regulation. Facebook can afford to comply, and Facebook knows that they need some kind of regulatory scheme. That's why they've been building their own. Facebook is spending lots of money on basically its own judicial system that operates within the company. And so I think everybody knows that there is a problem here. It's just hard to figure out how to fix it. And the First Amendment is a really difficult piece of this discussion because if you are just talking about what's happening inside a social media platform, the First Amendment doesn't apply. Those are private companies. They can have whatever they want up on their sites. Where the First Amendment gets implicated is if Congress has given this protection against lawsuits and it starts to mess with that protection in ways that are regulations of speech. The First Amendment comes into the calculus, and that's it's hard to remember when and where the First Amendment applies and where it doesn't. And I think that's important to tease out as you're listening to people talk about Section 230. It feels to me like Section 230 has become just an easy way to get around the difficulty of navigating the First Amendment, the way you were talking about Sarah. 

Sarah [00:11:51] Well, because the other thing to remember is that they're private companies, they can regulate speech. The idea that their content moderation, even if not neutral, is some sort of violation of free speech is ludicrous. They're private companies. They can do what they want. But there seems to be the push, and I think especially on the left of like this quid pro quo will lay off the regulatory environment in other areas if you dial up your content moderation because we can't do it, we the government can't do it, but you can. And the Supreme Court would not go for that either. It's not like they can't see the forest through the trees. They're going to strike that down, too. 

Beth [00:12:23] I think another difficult line here is the difference between content moderation and curation, because if you take it back to the bookstore days, bookstores were never in the position that internet service providers are in in terms of the volume of information coming through. But bookstores which were found not to be publishers were absolutely doing some equivalent of content moderation by deciding what they were going to sell and not having any requirement on neutrality in terms of what they were and weren't going to sell. So we kind of lose the plot and some of these conversations because of the technology. But these are issues that we've struggled with, you know, far before the internet existed. 

Sarah [00:13:06] And we are not the only ones. And that's the fourth thing we want you to know. Europe is out there chipping away at this problem. The European Union is out there working hard on how to regulate tech companies. So currently the European Union has the e-commerce directive and the ECD contains liability exemptions notice and take down. It works very similarly to how Section 230 works, but they have a new proposal. It is the Digital Services Act that goes beyond merely updating liability protections, but really balances this need to protect users from online harms such as disinformation and child pornography and counterfeit goods. So they're trying to bring in these other concerns. But they don't have the First Amendment. That's an important thing to remember, and they're still struggling with this issue of the size of some of these companies. 

Beth [00:14:02] So the Digital Services Act attempts to define very large platforms as platforms that have more than 45 million users or 10 percent of the population, and it puts them under additional regulations so they have to disclose their algorithms and submit reports on content moderation. And there is a virtual complaint system for those platforms. 

Sarah [00:14:25] I think that part's the most fascinating the idea that they would have to disclose their algorithm, you know, if anybody who's ever like been put in a Facebook jail or been harassed, I mean, you especially remember in the beginning days, like you were just out of luck if you needed to contact Facebook about an issue. Godspeed. And it hasn't gotten much better. And so I think this idea of like there's a trigger of additional requirements as opposed to putting all the requirements in place and punishing the smaller players makes a lot of sense. 

Beth [00:14:52] So they're working towards greater transparency and being more aggressive in protecting data in order to qualify for that liability protection. And I do really like that part. Last week, members of the parliament's Internal Market and Consumer Protection Committee discussed the changes they want, and there were over a thousand amendments proposed. Don't be intimidated, guys. They can do it. 

Sarah [00:15:13] They can do. I don't know how they do that. And also look, the tech giants are lobbying hard against this level of regulation, just like they lobbied hard against the privacy protections and some of the security protections they've already put in place because Europe is leading the way on this stuff. That is what we're looking at. Europe is leading the way on regulating social media and these huge tech giants.

Beth [00:15:39] Which is a good reminder for us. This is the fifth thing that we want you to know. Section 230 is not our only tool to regulate the internet. We started out by talking about how internet distributors are similar. Now let's talk about their differences. 

Beth [00:16:10] We started out by talking about how internet distributors are the same as bookstores, but let's think about how they're different and what makes regulating them harder. They have infinite capacity. You know, bookstores, even cable news channels have a limited bandwidth in a way that social media platforms run by algorithms just don't. 

Sarah [00:16:30] And, you know, moderation is technically possible to a certain extent. But the idea that we would have editorial oversight in the way that we're used to isn't, and I think that's the really messy overlap, too, is we're talking about content, but then we're also talking about news and we're talking about places where news content is consumed and we're used to environments around news consumption, where we are choosing where we have some control. We can watch this network or that network. We can pick up this newspaper or not this newspaper. But we have almost no control over what shows up in our feeds on social media channels and for better or for worse that is a main news source for many Americans. 

Beth [00:17:14] So when we talk about the platforms, are our main concerns around content or about the size of these platforms? A lot of critics point out that it's really the business model marketing in our data and then using anti-competitive practices to grow. That is the issue, I think. I think so often about Dr. Shoshana Zuboff and how she writes about epistemic inequality. This idea that there is now this enormous knowledge gap between us and these platforms that we're engaged in all the time, and that is definitely where Europe started with its regulation, with the business model itself. And it seems to be where an increasing amount of focus is happening in the U.S. Congress. 

Sarah [00:17:58] Yeah, I am happy to see more conversations about monopolies and more conversation about data security and privacy and less conversation about Section 230. We have a bipartisan coalition of House lawmakers that are pushing a major package of antitrust bills that focus squarely on the tech giants themselves, not the internet overall, not content providers overall. But let's talk about these companies. They're not upstarts anymore. Right? We're not. We're not harming innovation by going after Google, Apple, Facebook and Amazon. So they have a big package of legislation. We have the Ending Platform Monopolies Act, which would make it an unlawful for a platform with at least 50 million monthly active U.S. users and a market cap over 600 billion to own or operate a business that presents a clear conflict of interest. So there we see what Europe is doing. Well, let's figure out a way to set these apart. So there were not built in a regulatory environment that punishes the small guys, but that one that is targeted at the tech giants. 

Beth [00:19:00] There's also conversation about the American Choice and Innovation Online Act. This would prohibit dominant platforms from giving their own products and services advantages over those of competitors on the platform. This comes up a lot with Google and a lot with Amazon in particular. So lawmakers are trying to say, you can't say you're just a platform, but then compete with everybody else within your platform and compete unfairly by putting your stuff in front of consumers in the biggest, boldest letters. 

Sarah [00:19:33] Well, because that's, you know, that's like one of the primary business models. We'll buy the technology that's a competitor and will gobble it up and then it will be a part of us. You see that with Instagram. You see that with Snapchat, you see that with WhatsApp. You also have another piece of legislation called the Platform Competition Opportunity Act, which would shift the burden of proof in these merger cases, where you're seeing them merge with another huge tech company that these acquisitions aren't in fact lawful. So they'd have to prove that the dominant platform would have to prove that instead of the government having to prove that it will lessen competition. 

Beth [00:20:06] And then there is the ACCESS Act because everybody loves an acronym in Congress. Augmenting Compatibility and Competition by Enabling Service Switching. This would mandate that dominant platforms maintain certain standards of data portability and interoperability, making it easier for consumers to take their data with them to other platforms. And this is probably the proposal that I am most interested in because I think it gets to some of our big concerns. If you think about Section 230, what happens if we repeal Section 230, a flood of lawsuits? That's it, right? We would still have the Wild West. It would just be the Wild West enters the court system and you would have judges trying to make difficult calls on when these platforms should be liable and which platform has the capacity to have that liability and which does not. And I think that would just be a mess. But looking at the business model and thinking about who owns the information and protecting our privacy and protecting our ownership of our own content that we create for these platforms that makes me feel optimistic about where some of this conversation is going.

Sarah [00:21:12] Well, and we have Big Tech reformers being appointed to positions of power inside the administration, including the Federal Trade Commission. And I think that is reflective of the realization that the power of these companies comes from their size. And that is even true with the valid concerns about former President Trump being removed from the platforms. These platforms have so much power if they can silence a former president, right? But the power is not in the content moderation. The power is in the size of the platform and Section 230 gets at the content moderation, but does not get to the size of the platform. And to me, that's where the focus should be. The focus should be on the size of these companies. They're anti-competitive practices. We've been talking a lot in Pantsuit Politics land about platform neutrality, and that's sort of the approach of the new generation is they don't feel any loyalty to a platform. Wherever the content is succeeding, that's where they'll go and they might be somewhere else new. And so that's the other thing that I think is interesting about the ACCESS Act is it's like it's actually responsive to where people are in their usage, which is hard to be. Because the usage changes so dramatically in a space like the internet, but I think acknowledging that for better or for worse, the First Amendment is going to be very difficult to get around if we want any sort of government role in content moderation. And so the best way to deal with this issue is to face the sort of beating heart of it, which is the size of these companies. 

Beth [00:22:40] And I also think that some of this legislation sounds necessarily adversarial to these tech companies. I do think so many of the people within these companies want some kind of structure. A lot of the activity from these companies right now looks nefarious and is really just them operating in a sea of uncertainty and a structure that is not built for this day and age and for the tools that they've created. There is a responsibility within these companies, yes, but they have so transformed our society that I think there is a societal responsibility to come back around them and put some guardrails up that many of these companies would welcome. 

Sarah [00:23:31] You know, I was excited to have this conversation because internet regulation is back in the news due to the Facebook files, the blockbuster reporting from The Wall Street Journal about internal documents and Facebook that showed any number of things. The one that's made the most headlines is that they knew that Instagram was incredibly damaging to teenage girls. And you know, this is coming from whistleblowers asking for whistleblower protected status within these companies. I think for so long we said they're innovators. They're new. They say they want to help the world. We'll just take their word for it. But we don't exactly understand what this is or where it's going. But that time has passed. These are our railroads, right? These are giant monopolies affecting because we're not in the industrial age anymore. We’re in the technological age. So it is these technological monopolies that necessitate regulation because what we learned in the industrial age and it shouldn't surprise us in the technological age is that they are not going to be motivated to do the right thing for any number of reasons. And not just because, you know, it's easy to pick on Mark Zuckerberg and call him evil. But because of the structures of these corporations and the way that our economy functions and for a million different psychological and just human reasons, we don't need to learn this lesson all over again. That depending on giant corporations that play an outsized role in our economy, to do the right thing is not a strategy. It's just not. 

Beth [00:24:59] Well, and look, doing the right thing and deciding what that means in these contexts is hard. You read some of the cases that come before the Facebook court. They're not easy decisions to make. The principles at work here have not been democratically decided. I think if you put 10 people in a room, 10 very reasonable people, you would see a lot of disagreement on any of the questions that you put in front of them about what is the right thing here. Because speech is hard, knowing what is true in the technological age is hard. Deciding what constitutes misinformation versus a, you know, terrible opinion is hard. And so I think that's why real focus and structure is important here. And I also think it's why getting that focus and structure is going to require something different of Congress than we've asked of Congress before. In some ways they are our railroads, but in other ways, the pace of change and the depth of questions presented is a whole new universe. And it's difficult to imagine for me that our Congress, which just a couple of years ago was coming to understand how Facebook made its money, can keep up with that and answer those questions. But I'm glad that we have some global models for them to observe and learn from because we've got to catch up to this one way or another. 

Sarah [00:26:28] And we thank you for joining us here for this conversation, and we know it's a hard conversation to have, it's a difficult subject to think about. It's complicated. It touches on all our vulnerabilities, not just about our knowledge about how these tech companies function, but our participation within them. But we think it's important. We hope that you learned something in this episode. We will be back with you next week and until then, keep it nuanced y'all. 

Beth [00:27:03] Pantsuit Politics is produced by Studio D Podcast Production.  

Alise Napp is our managing director.

Sarah [00:27:08] Megan Hart and Maggie Penton are our community engagement managers. Dante Lima is the composer and performer of our theme music. 

Beth [00:27:15] Our show is listener-supported. Special thanks to our executive producers 

Executive Producers (Read their own names) [00:27:19]  Martha Bronitsky, Linda Daniel, Ali Edwards, Janice Elliot, Sarah Greenup, Julie Haller, Helen Handley, Tiffany Hassler, Barry Kaufman, Molly Kohrs.

The Kriebs, Laurie LaDow, Lilly McClure, David McWilliams, Jared Minson, Emily Neesley, Danny Ozment, The Pentons, Tawni Peterson, Tracy Puthoff, Sarah Ralph, Jeremy Sequoia, Karin True, Amy Whited, Emily Holladay, Katy Stigers.

Beth [00:27:51] Melinda Johnston, Joshua Allen, Morgan McHugh, Nichole Berklas, Paula Bremer and Tim Miller.

Alise NappComment