September 28, 2021

Casten and Witness Agree: “Misinformation is Killing People”

Washington, DC – Today, during a hearing on the Disinformation Blackbox held by the Science, Space, and Technology Subcommittee on Investigations and Oversight, U.S. Representative Sean Casten (D-IL) questioned misinformation experts Laura Edelson, Co-Director of Cybersecurity for Democracy at New York University, Dr. Alan Mislove, and Dr. Kevin Leicht on the catastrophic repercussions of misinformation and disinformation on social media platforms, and on those platforms' ongoing efforts to revoke the data access of independent researchers like Edelson.

In 2018, Rep. Casten questioned Facebook CEO Mark Zuckerberg, who refused to state whether Facebook exempted high-profile users from its misinformation and hate speech rules. Recent reporting from the Wall Street Journal not only confirmed this, but revealed Facebook's use of a "whitelist" shielding millions of VIP users from the company's standard content moderation practices. Meanwhile, Americans face the deadly consequences of Mr. Zuckerberg's "company over country" mantra: an infodemic in which the viral, largely unchecked spread of misinformation online has jeopardized public health, enabled terrorist recruiting and communication, and continues to thwart efforts to vaccinate Americans against a pandemic that has already killed well over half a million people in this country.

Video of Rep. Casten's questioning can be found below:

Click here to watch Rep. Casten question Ms. Edelson on Facebook's exemption list:

Click here to watch Rep. Casten question Ms. Edelson on the deadly consequences of COVID-19 misinformation and disinformation:

Click here to watch Rep. Casten question Ms. Edelson on Facebook's role in the January 6th attack on the Capitol:

Find the transcript from the hearing below:

Transcript

Rep. Casten:

About three years ago—it's relevant that this was before COVID, and I feel somewhat prescient in an angry way—Mark Zuckerberg testified before the Financial Services Committee, and I asked him, first, whether Facebook would suppress anti-vaccine information if it came from Jenny McCarthy's Facebook page, and then separately whether it would suppress information from the American Nazi Party if it came from Art Jones' Facebook page. Art Jones at the time had just won the Republican nomination to run for Congress in Illinois' Third Congressional District. His answers were unsatisfactory and seemed to suggest that the content of the information was one question and the speaker was another. I mention that because the recent Wall Street Journal reporting that they are in fact whitelisting certain high-profile people suggests that this problem has not been solved.

I'd like to start with Ms. Edelson, because it sounds like you've spent a lot of time thinking about this. In your research, as we sit here right now, do you see a disparate approach to information protocols depending on the speaker?

Ms. Edelson:

There certainly exist, as we all now know, two separate systems on Facebook, where some speakers are effectively not moderated at all, and then there's everyone else. I think this is almost entirely backwards, because what Facebook has set up is a situation where the speakers who have the widest reach are free to spread whatever lies they choose, and it will take a long time for Facebook to act, and often Facebook won't act at all. You know, this is where I think there is a difference in how we think about content moderation versus how we think about content promotion. I think that speakers who have a bigger audience should have a bigger responsibility to ensure that the information the platforms spread on their behalf to their audiences is factual.

Rep. Casten:

I think we're all fond of the framing that freedom of speech and freedom of reach are two separate things, and I think sometimes we allow them to amplify horrible messages that would go away if we just limited freedom of reach.

Rep. Casten:

Shortly after you released your results, which found that people who rely on Facebook for information have substantially lower vaccination rates than those who rely on other sources, Facebook cut off your access to data. I think your research said that of people who rely exclusively on Facebook for news, 25% do not intend to get vaccinated. Now, I appreciate that in your testimony you said Facebook is using privacy as a pretext to squelch research that it considers inconvenient, and I worry sometimes that that sounds like, well, we don't do some research, how much does that really matter? I realize we're all math and science nerds here, at least since Mr. Perlmutter has been able to continue. But at core, this is an epidemiological question, right? If we know that certain behaviors increase the rate of spread of a communicable disease, the rate of contraction of a communicable disease, there are consequences. You do epidemiology right and people live; you do it wrong and people die. Can you speak at all to the consequences of your inability to do what is, at core, epidemiological research?

Ms. Edelson:

Yeah, I think you're right. I'm willing to say this—misinformation is killing people. We have had a safe and effective vaccine for COVID for a long time now, and we're back over 2,000 deaths a day. Facebook is not the only reason this is happening, but it's certainly contributing, because of exactly that study you cite, which I personally keep in mind. Right now, there is vaccine misinformation that is widespread and easily available on Facebook. I know this because I have colleagues who still do have access to Facebook, who find it and try to report it every day. And it's really, really hard for those folks, because they do not feel like the platforms are their allies in this. And again, this is something that Facebook's own research has pointed to, and that Facebook has just chosen not to fix.

Rep. Casten:

A couple of weeks ago, we recognized the 20th anniversary of 9/11. And among the things we recognized were the heroes on Flight 93, who in a largely pre-Internet era, on a plane, within 10 minutes were able to deduce that there was about to be a terrorist attack on the United States Capitol, and got together to stop it. Is it reasonable to assume that in the more recent attack on the U.S. Capitol, given how much was being amplified on Facebook, a bunch of smart computer nerds at Facebook had a priori knowledge of what was being organized? Because those 40 people on Flight 93 figured it out.

Ms. Edelson:

Yeah, I'm sorry, this is really... I worked on Wall Street on 9/11, and that was a bad day, a really, really bad day. And I will remember the morning of January 6, because I told my team that morning that I thought it was going to be a bad day. This is what I live and breathe; I look at this stuff every day, and it's awful. I don't know if anyone at Facebook knew it was going to be a bad day. I don't know; I don't work there. But one of the things we do know is that their internal research has been telling them about the extremist problem for years. They knew that their algorithm was promoting hateful and extremist content. They knew that there were fixes. They knew that those fixes might come at the cost of user engagement, and they chose not to put those fixes into place. So as to whether anyone knew on January 6, I don't know. But they knew about the problem. They knew how to fix it. And they chose not to.

# # #