The Will to Act

Armon Owlia: Coming up on “This American Divide…”

Owlia: The dominoes had to fall in precisely the right order and time to create the world we live in.

Sharon McMahon: It treats these social media companies as, you know, grants them broad immunity.

Frances Haugen: Almost no one outside of Facebook knows what happens inside of Facebook.

Elon Musk: I don’t need to listen to what they say.

Owlia: There seems to be a "Wizard of Oz"-level of mysticism…

Owlia: Do you like what you see?

(static effect)

Michael Smerconish: Is social media harmful? That’s the question being raised and debated.

(static effect)

Joe Fryer: We’ve got families all over the country who are dealing with this.

(static effect)

Tom Costello: As anti-Semitic content has surged on Twitter after Elon Musk, who emphasized free speech, took full control. Use of the N-word also jumped 500 percent.

(static effect)

Jo Ling Kent: After close reviews of the President’s recent tweets, it banned him, “due to the risk of further incitement of violence.”

(static effect)

President Joe Biden: It was an enraged mob that had been whipped up into a frenzy.

Alisyn Camerota: The alleged attacker posted conspiracy theories on Facebook.

(static effect)

Brian Latimer: The social media business is evolving a lot these days.

(static effect)

Alisyn Camerota: “Spider-Man” star Tom Holland announcing he’s taking a break from social media for the sake of his mental health.

(static effect)

Jake Tapper: Cherished ideals of free speech are in the hands of erratic billionaires.

Owlia: It’s time to examine “This American Divide.”

Owlia: Complex problems always have complex backstories, and political polarization in America is no exception. While it's easy to blame a single group or factor for any issue we face, that approach gets us nowhere in a time of such complexity. The dominoes had to fall in precisely the right order and time to create the world we live in. We've gone over those actions, including the intentional division of America to put Richard Nixon in the White House, the weaponization or demonization of the press, depending on where you fall on the political spectrum, the development of social media and, inevitably, the opportunistic ways it would be used by politicians. However, a few more dominoes we have not discussed needed to fall. Funnily enough, these dominoes were not meant for hate or intentional division. They were laws created by people who had their hearts and intentions in the right place, written to ensure the growth and freedom of the early internet. In the three-way Venn diagram of failure between social media companies, the government, and the human tendency to be reactionary lies a set of laws called the Communications Decency Act of 1996, with one provision in particular standing the test of relevance.

Sharon McMahon: I’m Sharon McMahon. I’m a Jefferson Award winner for Outstanding Public Service By A Private Citizen. I’ve been in a variety of publications, news outlets, you know, that sort of thing.

Owlia: McMahon, also known as "America's Government Teacher," has combated misinformation through her podcast, "Sharon Says So."

Owlia (in conversation with McMahon): What exactly prevents the government from taking any action in terms of creating and/or enforcing social media regulation?

McMahon: Well, Section 230 of the Communications Decency Act is part of it.

Owlia: Section 230 is often regarded as the law that allowed the internet to develop into what we know today. But why is it so important? Let's hear from the Congressional testimony of one of its co-authors, current Democratic Senator Ron Wyden of Oregon.

Ron Wyden: Section 230 was absolutely necessary to bring our legal system into the 21st century. It has been the legal foundation for the growth of the Internet, particularly in areas like education, jobs, and a platform for free speech around the world. 

Owlia: Wyden, alongside former Republican Representative Chris Cox of California, understood that the internet, at least in 1996, was novel and needed further exploration. Laws needed to be written to give internet companies tremendous freedom to grow and evolve, as Wyden further explained.

Wyden: Now, when I wrote Section 230 more than 20 years ago, it was in recognition of the fact that the Internet was just going to change everything, the way we interact with each other, the way we do business. It would change virtually every corner of our lives and our society. And we understood that no amount of legislation and political bloviating could stop the change, but we could influence how it came about. 

Owlia: The law itself is very long, and I will not subject you to all of it. However, one provision must be addressed because it makes Section 230 crucial to understanding why a solution is so hard to find. Subsection (c), also known as the "Good Samaritan" clause, reads, “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider. No provider or user of an interactive computer service shall be held liable on account of any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or any action taken to enable or make available to information content providers or others the technical means to restrict access to material described in paragraph (1).” So, what exactly does that mean?

McMahon: It treats these social media companies as, you know, grants them broad immunity from what's being posted on their platforms.

Owlia: In other words, social media companies cannot legally be held liable for content posted on their platforms. Before you start yelling, you need to understand that, in 1996, such a clause was necessary because so much was unknown about the internet and its power. Hell, in 1996, who could conceive of a social networking site? Who could have predicted that Facebook, Google, YouTube, Twitter, and Instagram would not only exist but also become conduits for original content and information in the way that television, radio, and newspapers are now? In 2023, however, the difference is clear: television, radio, and newspapers are held to a level of accountability under which even one piece of misinformation hitting their airwaves would severely damage, if not destroy, their credibility and ability to operate. With such autonomy, social media companies can handle content however they see fit. They do, in fact, have the power to make changes. This is Frances Haugen, who not only leaked material about Facebook's operations to the Wall Street Journal but also testified before Congress on the ramifications of that leak.

Frances Haugen: Only Facebook knows how it personalizes your feed for you…Facebook hides behind walls that keeps researchers and regulators from understanding the true dynamics of their system.

Owlia: We've previously established that Facebook and other social media organizations have control of every aspect of the platform, notably the algorithm that keeps people plugged in and contributes to the spread of misinformation.

Owlia (in conversation with McMahon): In your professional opinion, do you feel social media companies should hold some level of responsibility for the effects their product has on younger people?

McMahon: Yes, in that they are the creators and originators of the algorithms that can, have been shown to negatively impact people. That is, that's their own creation. They can change at any time they want to. They can take, they have the capacity to take steps to mitigate against some of the negative effects of social media. They have the ability to police their platforms if they choose to.

Owlia: As a result, social media companies could actively choose to make changes that would, overall, be highly positive. However, there seems to be a "Wizard of Oz"-level of mysticism that these organizations put up for the public and the government. That mysticism, and to a lesser degree the leverage the companies hold, prevents the government from taking action.

Haugen: We are given no other option than to take their marketing messages on blind faith. Not only does the company hide most of its own data, my disclosure has proved that when Facebook is directly asked questions as important as, “How do you impact the health and safety of our children?,” they choose to mislead and misdirect. Facebook has not earned our blind faith.

Owlia: Surprisingly, however, it's not solely a matter of whether Congress wants to do anything about the technology companies.

McMahon: Most members of Congress that I am aware of do not hold the social media companies in high regard. In fact, you know, the people who are on the left dislike social media companies for different reasons than people on the right. But nevertheless, they both have contempt for the social media companies. You know, they have a common enemy. They dislike that enemy for different reasons.

Owlia: So, if social media is a common Congressional enemy, why exactly hasn't there been any action? The answer is simple, almost laughably simple.

McMahon: The other part of it has to do with lack of congressional understanding about how the Internet actually works. A lot of people in Congress are not avid Internet users in the way that people, younger people are. And this is not like, it, meant to be ageist or anything of that nature, they're certainly capable of understanding if they want to, but it is a tremendously complex thing to begin to wrap your mind around if you are not an avid Internet user that some younger people are. You know, the average ages in Congress are quite a bit older than average ages around the rest of the country. So, that's part of it. I mean, even the Supreme Court last week, Justice Elena Kagan said…

Elena Kagan: We're not, like, you know, the nine biggest experts on the Internet.

McMahon: Like, even she had the self-awareness to be able to say, like, ‘We're not who needs to be deciding this.’ And these are some of the brightest minds in America, right? Chosen through a very careful vetting process and wildly highly educated and even they're like, you know? Elena Kagan used to be the Solicitor General of the United States. She used to be the, the, the lawyer for the United States of America.

Owlia: At almost every level, the government doesn't truly understand how the internet works, let alone the legal questions surrounding it. So, when social media companies promise to regulate themselves or insist they can do it without government intervention, it's naturally easy to take them at their word if the government doesn't know what to do. Once again, Frances Haugen.

Haugen: During my time at Facebook, I came to realize a devastating truth: almost no one outside of Facebook knows what happens inside of Facebook.

Owlia: And for those who think this is isolated to Facebook, the purchase of Twitter by Elon Musk is a prime example of what happens when trying to regulate a platform internally goes terribly wrong. Before Musk's acquisition in October 2022, Twitter tried to enforce certain levels of regulation regarding misinformation and hate speech, most notably providing brief fact-checks on COVID-19 tweets, including those about vaccines and masks that contained misinformation. After the events of January 6, Twitter took the extraordinary step of banning Donald Trump from the platform, along with others who helped encourage the insurrection and violated Twitter’s own rules about hate speech. In a tweet shortly after buying the platform for $44 billion, Musk stated he had purchased Twitter to prevent it from being…

Tom Costello: “A free-for-all hellscape, where anything can be said with no consequences.”

Owlia: Musk instituted major changes quickly, such as announcing a "Content Moderation Council" with diverse viewpoints and reinstating figures such as Kanye West and Donald Trump, despite their bans for defamatory language, hate speech, and, in the case of Trump, spreading misinformation to aid and abet a riot. Musk also announced the Twitter Blue paid subscription for $8 monthly, selling the iconic blue checkmark that had previously been granted, after an intense screening process, to a small few, including public figures and journalists. However, these changes brought a litany of controversies that only made the misinformation crisis worse. As a result of bringing back users such as West and Trump, hate speech on the platform skyrocketed.

Errol Barnett: Hate speech is surging on Twitter following Elon Musk’s takeover, despite his claims last month they’d fallen on the platform. According to findings from the Center for Countering Digital Hate and the Anti-Defamation League, posts which include racial slurs against black people have trebled, slurs against gay people are up 60%, while there’s been a 61% spike in antisemitic tweets referencing “Jews” or “Judaism.”

Owlia: Because anybody could buy a blue checkmark on Twitter, creating misinformation with catastrophic results became easy.

Savannah Guthrie: Wall Street is watching pharmaceutical giant Eli Lilly today. Its stock plummeted last week after someone impersonated the company on Twitter. Said it would make insulin free. Twitter had changed its policy allowing anyone to acquire a blue-verified checkmark simply by paying an $8 fee.

Owlia: Musk's own tendencies, too, have encouraged a flood of misinformation on the platform. Remember that "Content Moderation Council" he announced? Not only has such a council not been formed at the time of writing, but listen to what Musk himself had to say about it.

Elon Musk: I just want to be clear about, we are going to do a Content Council, but it’s an advisory council. It’s not a…at the end of the day, it will be me deciding and any pretense to the contrary is simply not true. Because, obviously, I could choose who’s on it, the Content Council, and, I don’t need to listen to what they say.

Owlia: On top of that, in December 2022, Musk dissolved the crucial Trust & Safety Council that had been offering advice on countering misinformation, and he has repeatedly posted misinformation on topics such as the attack on Paul Pelosi, the husband of former House Speaker Nancy Pelosi, and the COVID-19 pandemic. He has also repeatedly claimed that the algorithms that run social media have a left-wing bias; numerous studies have shown that, in reality, the exact opposite is true. So, what exactly is the point of all of this? The days of social media claiming it can self-regulate are over. Time and time again, social media companies have proven not only that self-regulation doesn't work, but that their lack of transparency lets them do whatever they want behind closed doors. Once again, Frances Haugen, in her Congressional testimony.

Haugen: Congress can change the rules that Facebook plays by and stop the many harms it is now causing.

Owlia: It's not a matter of "if" but rather "how" and "when." Social media companies need to be held responsible and accountable for their actions, and if not by the government, then by whom? Once again, Sharon McMahon.

McMahon: You know, I don't know anyone who is educated on this topic who is like, “Yeah, they're doing enough.” And, you know, one of the things that they, this is one of the issues before the Supreme Court right now is how responsible are they for their algorithms? How, you know, like to what extent should they be held responsible for the algorithms that push harmful content to people, whether they're a minor or an adult? YouTube is like finding a needle in a haystack, right? There are billions of videos on YouTube and they want to keep you on the platform because that's how they make money off of you. And so consequently, they're going to push content to you that it thinks you will enjoy based on what you or your travels around the Internet, not just your use on the site, your travels around the Internet as a whole. That's not my responsibility as a user. I can't control any of that. That's their responsibility to, to responsibly use algorithms and to responsibly profit off of their users. It's not your responsibility to profit off of their users. So, no, I don't think they are being held accountable to the extent that they should be. I do think they're, personally that they are responsible for their algorithms. They made them. Nobody else has control over them but them. And in some ways, they have failed to anticipate the repercussions, the huge repercussions that their algorithms have had in the real world.

Owlia: Misinformation will continue to go in circles without the proper knowledge and transparency. However, we've seen that when an issue is important enough for Congress, they learn as much as they can to handle it. We are at a tipping point; such an initiative must be found quickly. Regulation and education. These are two of the tools necessary to tackle this issue. We've talked about one already. But the other? Well, let's open the textbooks.
