Remember when social media sites helped legitimize the now-discredited Steele dossier, labeled the Hunter Biden laptop story as fake, and completely removed a sitting President of the United States from their platforms? These episodes (and many others) are the reasons for the growing distrust of Big Tech.
But recently, the U.S. Supreme Court announced it will hear a case that could provide valuable guidance on the scope and application of Section 230 of the Communications Decency Act of 1996. This is the federal statute that grants social media companies immunity for what other people post on their platforms, along with the power to remove content at their discretion. On this episode, we talk about the case as a way to hold Big Tech accountable, walk through the potential outcomes, and get valuable insight into where the justices might land in their decision.
Tim Doescher: Folks, thank you so much for listening to the Heritage Explains podcast. We love doing this thing every week for you, but we also do a lot of other podcasts here at The Heritage Foundation, and I wanted to make sure you knew about them. We've got the Kevin Roberts Show. We've got Problematic Women, the Daily Signal podcast. We've got SCOTUS 101, and we've got Heard at Heritage. Now, all of these podcasts present a different angle of the conservative cause, the conservative movement, which we count you a part of if you listen. So head over wherever you listen to podcasts and subscribe to all of them right now before we start this episode.
Doescher: From The Heritage Foundation, I'm Tim Doescher, and this is Heritage Explains.
NBC News: Nohemi Gonzalez, those close to the 23-year-old simply called her Mimi.
Sandra Felt: She was very warm, very caring.
NBC News: Nohemi's aunt says the student from Southern California was in Paris, fulfilling a dream. Fascinated with the city, Nohemi wanted to learn French, so the design major from Cal State Long Beach signed up for a foreign exchange program.
Sandra Felt: Very much of a go-getter. Everything she wanted to do, she went after it, and she found the way to get it done.
NBC News: Her determination shattered when a gunman burst in. Nohemi was eating at a restaurant with two friends Friday night when she was shot in the stomach and killed.
Doescher: That tragic 2015 story taken from NBC News replays the horrible terrorist attack in Paris. If you recall, especially back then, ISIS was a very scary force for evil in this world, and this was one of the many reasons why. It's a devastating story to relive, but the story doesn't end there. Since Nohemi's tragic murder by terrorists, the Gonzalez family has sued Google through its YouTube subsidiary.
ABC News 30 Video: An Oakland federal court judge heard a motion to dismiss today filed by attorneys for Google and its media platform, YouTube. They're facing a lawsuit that says ISIS uses YouTube as a tool and weapon of terrorism.
ABC News 30 Video: These companies just simply have to do something because they continue to allow these nefarious organizations to harm Americans, to harm other people, and they do nothing and they don't act reasonably.
ABC News 30 Video: The lawsuit filed by the Gonzalez family says ISIS wouldn't even exist without YouTube. Videos like these exaggerated the terrorists' power and helped them recruit fighters.
Doescher: That ABC News 30 video from 2017 was just the start. The lawsuit alleges that because Google allowed its algorithms to recommend video content from the terrorist group, the company should be held partially responsible. Now enters the Section 230 debate we've covered on Explains in the past. Just as a reminder, Section 230 of the Communications Decency Act, better known as the 26 words that built the internet as we know it, effectively immunizes interactive online providers, like YouTube, from liability when they make targeted recommendations of information provided by another source.
Doescher: But what happens when those content providers are, let's say, ISIS? Should a company like Google still be immune from liability? What does this say for the larger issue of content moderation and online censorship? Until now, these questions have been bouncing back and forth throughout the court system, but recently, the U.S. Supreme Court said it will hear the case and, hopefully, provide much-needed guidance.
Doescher: So what should that guidance be? What should it look like? Sarah Parshall Perry is a senior legal fellow in the Meese Center here at the Heritage Foundation. She has a great analysis of this case, as well as the dynamics at the highest court in all the land and all that's at stake. On this episode, she explains.
Doescher: Sarah, Section 230 has been a hot issue in the tech policy community for several years. There's no question about it. These words have been credited with building the internet as we know it today, and they're now under scrutiny at the U.S. Supreme Court. So this is the deal. This is happening right now. All of the tech policy disagreement is finally going to get a hearing at the U.S. Supreme Court. We explained it at the top of the episode just a little bit, but I wanted to give you a chance to give us a short rehash of what Section 230 of the Communications Decency Act of 1996 is.
Sarah Parshall Perry: So I'll give you a little history first, because that'll help you understand where this particular law came from. In 1995, a New York court held an internet service provider liable, as a publisher, for defamatory information posted by one of its users. Now, Congress didn't want providers held liable for secondary content hosted on their platforms, so Section 230 was meant as a shield from the civil penalties and civil liability that would otherwise attach for third-party content. Unfortunately, a law that was ultimately designed to limit the spread of pornography and expand free speech in the internet sphere has worked in precisely the opposite way.
Doescher: Okay, so they started by going, "Hey, we're going to get rid of porn showing up in little kids' emails and all that kind of stuff." But then we built social media around Section 230, which allowed anybody to post something, say on Facebook. They could post whatever they wanted on there, and Facebook would say, "Well, that was pretty bad, but we shouldn't be held accountable for what that person posted." Okay, so that's what Section 230 has really become.
Parshall Perry: Yes.
Doescher: Okay.
Parshall Perry: And part of the problem with where we are now with Section 230 is that the statute itself does not require political neutrality. That was never anticipated as a requirement at the time, because in 1995 and 1996, when the legislation was being drafted, no one thought the big tech cabal would grow to such an outsized level of influence that we would need political neutrality. But we find ourselves now, as conservatives, on the receiving end of the brunt of internet censorship, no matter the big tech platform.
Doescher: Okay, so there are two parts to Section 230. There's an immunity part, which is what we were talking about: you can't be held liable for what someone else posts on your platform. Then there's a Good Samaritan portion, which allows them to go ahead and remove content. Am I getting that right?
Parshall Perry: Yes, and the Good Samaritan portion is the part that helps us understand why big tech is treated like, for example, a newspaper. In other words, if the New York Times ran a guest essay that had a defamatory claim, you couldn't sue the New York Times. It would be shielded as merely the publisher of a third party's content on its platform. You'd want to sue, individually, the person who made the defamatory statement.
Parshall Perry: That was the thinking behind applying Section 230 in precisely the same way to the internet sphere. It was, in a way, designed to allow these companies to act like traditional publishers, magazines, newspapers, TV outlets, so that the internet was just one more avenue of information exchange, and it would protect them from being sued in the same way that, for example, the New York Times could be protected from being sued. It doesn't require political neutrality. It allows them to exercise "good faith" moderation, and that's direct statutory language. But what that "good faith" has turned into is, again, instead of Section 230 serving as a shield, we're now seeing it wielded as a sword directed precisely at the interests that are politically inconvenient or politically unpopular.
Doescher: Think the Hunter Biden laptop story.
Parshall Perry: Right.
Doescher: Yeah, that's a classic example.
Parshall Perry: Or pandemic information, and the way we discovered that the government was directly involved in the moderation and manipulation of information, election information as well. We've also discovered incidents of malfeasance where certain platforms were paid to throttle content up or back. When we see the government get involved in stories like this, that, to me, brings us closer to more direct government involvement, and where the government is involved, the First Amendment applies.
Doescher: All right, let's get into the case that the U.S. Supreme Court is going to hear. I feel like I'm in law school right now, being your professor, asking for the facts of the case. Gonzalez v. Google, which, by the way, just gave me chills thinking about that. Just give us the facts of Gonzalez v. Google. How does this case play into the 230 debate?
Parshall Perry: So this is a claim that was brought under the Anti-Terrorism Act by the family of a young woman who was the only American killed in the 2015 terrorist attacks in Paris. Now, the family advanced an anti-terrorism claim against Google, YouTube's parent company. Their claim was ultimately, "You aided and abetted her killing by using your algorithms to recommend jihadist videos to radicalized Muslims, the individuals you believed would benefit from them."
Parshall Perry: Google has said, "No, no, your claim shouldn't succeed. It should fail as a matter of law. We're shielded by Section 230." They've said, "It was simply computer algorithms that suggested this information," to which the petitioners, the Gonzalez family, have said, "Once you are making moderation decisions to recommend particular content, you're no longer acting like a traditional publisher." You're not making a decision like the New York Times would make, where it would be: we have reason to believe this is obscene, we have reason to believe it's defamatory, we're going to exercise, again, good faith judgment, and we're going to pull it or let it run. No, in this case, they took a subset of individuals that they knew were watching jihadi videos, directed the algorithms to those individuals, and the claim of the Gonzalez family is: you violated the Anti-Terrorism Act.
Doescher: Wow. In my head, I'm saying, "Okay, so this is content versus content recommendations."
Parshall Perry: Correct.
Doescher: To me, that seems like a big difference worthy of review, and that's probably why the Supreme Court agreed to hear the case. Just boil it down for us here, Sarah. What are the stakes in this case? What would big tech and their special interest groups say? What do we say?
Parshall Perry: Well, for us, this is a very simple case. First of all, as you pointed out, it's a very discrete question. So it boils down to: are these traditional editorial functions, and again, I'm going to use the New York Times newspaper as the example, or are you doing something else? Here, the Gonzalez family is arguing, and we would argue, that content recommendation and the targeting of certain audiences go beyond those traditional functions. And boy, did we see that in the last go-round of elections. Yes, we saw it in the pandemic information, vaccine information; the examples are myriad.
Parshall Perry: We've seen platforms take their immunity and abuse it, turn the shield into a sword by saying, "We know these people are using X information for Y purpose. We are going to put more money into building algorithms so that this is the only information that they see." So for us, the minute you start taking those manipulative approaches, building out algorithms to direct only one set of information, only one perspective, to a particular group of people, you have ultimately done away with what Section 230 was designed to do, and that is promote free speech and the free exchange of ideas.
Doescher: It's amazing. And again, our heart and soul here at Heritage, we want to hold Big Tech accountable.
Parshall Perry: Yes.
Doescher: We really, really do. With that, I know that as I was reading through this, it seems like we're only hitting on the immunity big tech has here.
Parshall Perry: Right.
Doescher: We're taking on the immunity part of Section 230 here. We're not going to be taking on the "Good Samaritan" content moderation standard just yet. So my question is, and I've been thinking about this: if they go hard on the immunity thing, does that mean that big tech will answer back, that it'll strike back with even more content moderation? "Well, okay, if we can't be immune from this, then we're just going to go crazy in terms of content moderation." Is that opening the door?
Parshall Perry: Well, I think it's always a possibility when you have to look at the text and the precise wording of a particular statute, but this Supreme Court has made known that they are originalists when it comes to the Constitution, and they are textualists when it comes to statutory law. Those are two things that we want. We don't want anyone to benefit from the ambiguities that might be present in a particular statute. So because this was envisioned as a traditional editorial protection, when you begin to do things like recommend particular content to drive it to a particular audience, I believe the Supreme Court will take a hard look, particularly because, as you and I were talking about before this, Justice Clarence Thomas, as recently as 2020, said, "There will come a point where we will need to determine if Section 230 is doing what it was intended to do when it was passed in '96." We believe firmly that this Supreme Court will go to the precise statutory language, and it will be guided by congressional intent and what the words actually say.
Doescher: Yeah. By the way, this is just one piece of holding big tech accountable at the Supreme Court, and you talk about this in your piece, which I'm going to link to, folks, because it is a great overview of what's at stake at the Supreme Court in Gonzalez v. Google. Talk a little bit more about some of the other things that are working their way through the court system right now in terms of holding big tech accountable.
Parshall Perry: Well, it's interesting, because we started this segment talking about the fact that there are some circuit splits on this precise issue. Five circuits have taken the approach that Google wants. Three circuits have taken the approach that the Gonzalez family wants. Within those circuits, ironically enough, the states of Florida and Texas have state laws outlawing political censorship on big tech platforms.
Doescher: Yes.
Parshall Perry: They have both filed what are called petitions for certiorari, or petitions for review. The Supreme Court has determined that it will take up those cases, and then there is a related case from a 2017 terrorist incident, in which the family of another deceased young person is bringing a claim, also under the Anti-Terrorism Act, but this time against Twitter, for the exact same conduct: aiding and abetting a terrorist act in the ultimate killing of their family member. So there will be more big tech just in this term alone at the Supreme Court.
Doescher: Yeah. I think about this, and we've covered this before, in terms of what Florida's up to, what Texas is up to. We have admitted, "Hey, this is going to end up in the courts. This is going to be decided by them." Give us just a little bit of a sense. You mentioned Justice Thomas, but this is the Roberts Court, not a Thomas Court. This is Chief Justice Roberts' court. Where do you think this is headed under his purview?
Parshall Perry: I will tell you, we can take a look at some of the COVID litigation, and I think that gives us a bit of an indicator, a flashing neon sign, as to what this court is going to do. If you'll remember, the Supreme Court struck down the CDC eviction moratorium by saying that the underlying statute was never intended to allow tenants to press pause on their rent or mortgage payments during a pandemic; it was completely outside the statutory authority.
Parshall Perry: I will give you another example: the national vaccine mandate under the Occupational Safety and Health Act. Same thing, the statute was never intended to do that. It was meant to govern workplace safety conditions; it was primarily directed at hazards like asbestos. It was never meant to put a jab in everybody's arm, regardless of what you think about the vaccine. Then, toward the end of the term, they were faced with some other questions, one of which was whether or not you could force active-duty service members to violate their religious consciences by taking the vaccine itself.
Parshall Perry: In all these instances of claimed statutory authority, advanced by progressive interests within the Biden administration, the court said, "It wasn't anticipated by the statute, it's not in the text. You can't go further than the text." I am hopeful, based on just that, that they'll reach the exact same outcome here in the Section 230 case.
Doescher: Well, Sarah, the case hasn't even been heard yet, and we're going to have some fun tracking it and then, when a ruling comes down, we'll have you back here to go over what they ruled, but in the interim, thank you so much for being here.
Parshall Perry: Thanks for having me.
Doescher: That is it for this episode of Heritage Explains. Like I said at the top, please go and subscribe to all the Heritage podcasts, and for ours, go ahead, hit the like button. You can share it with your friends, your family, anybody who might be interested in our brand of podcast. We're pretty proud of it, folks. We do a lot of work to produce this for you every week. So again, thank you so much for listening. Michelle is up next week.
Heritage Explains is brought to you by more than half a million members of The Heritage Foundation. It is produced by Michelle Cordero and Tim Doescher, with editing by John Popp.