
On X, Misinformation About the Israel-Hamas War Is Spreading


A large X is seen on the roof of the former Twitter headquarters on July 28, 2023 in San Francisco, California. (Justin Sullivan/Getty Images)


The Israel-Hamas war has put Elon Musk’s transformation of Twitter to the test. Changes to its verification policy, major cuts to the company’s Trust and Safety teams, and Musk’s own rhetoric have worsened the spread of misinformation on the platform, with real-life consequences.



Episode Transcript

This is a computer-generated transcript. While our team has reviewed it, there may be errors.


Ericka Cruz Guevarra: Hey, it’s Ericka. Quick announcement: we’re hiring an intern. This is a 16-hour-a-week paid opportunity that runs from January to June of next year. If you’ve got a passion for local news and podcasts, let’s talk. The deadline to apply is November 17th. The link to the application is in our show notes. All right, here’s the show. I’m Ericka Cruz Guevarra, and welcome to The Bay. Local news to keep you rooted. Just hours after Hamas gunmen slipped into Israel from the Gaza Strip, unverified photos and videos began circulating on X, formerly known as Twitter. Old, repurposed images of armed conflict were being shared and passed off as new by anonymous accounts that had purchased blue checkmarks under X’s premium subscription service.

Davey Alba: A lot of misinformers kind of flooded the zone with their own narratives and spread confusion and fear.

Ericka Cruz Guevarra: Twitter has never really been the perfect source for reliable news. But ever since Elon Musk bought the company a year ago, the spread of misinformation on the platform has only gotten worse, and getting accurate news about Israel and Hamas on the platform feels like navigating a minefield. Today: why the Israel-Hamas conflict was a test for Elon Musk’s X, and how it failed.

Davey Alba: It has been a mess on the platform.

Ericka Cruz Guevarra: Davey Alba is a technology reporter for Bloomberg News.

Davey Alba: Beyond sort of the initial reports about the attack on Israel, there has been a steady drumbeat ever since of misinformation as Israel has turned its attention to Gaza. There’s a video of supposedly pro-Hamas supporters blocking off a highway in New York ahead of the so-called Global Day of Jihad, which is when a former Hamas chief called for protests around the world. That was actually a street racer waving a Puerto Rican flag. There have been videos that sought to undermine the work of real journalists.

Davey Alba: For instance, a video circulated asserting that CNN staged footage of its news crew under attack in Israel, and the audio was manipulated to serve that narrative. And there have been even just awful denials of the atrocities that are actually happening. So it’s a combination of a lot of complicated factors. We are in a fog of war, as a lot of experts have pointed out, and there are real-time events and reporting that need to happen over time before we really know what happens day to day.

Ericka Cruz Guevarra: As you and your colleagues have reported, the spread of this false information, as we’re seeing in this moment, seems to have increased or gotten worse because of some of these systemic changes. Can you sort of tick through some of the changes that have led to what we’re seeing now?

Davey Alba: The company loosened its platform’s rules, cut trust and safety employees after previously saying it would expand the team, reinstated banned accounts and allowed people to pay for a checkmark on X. I’ll start with the cutting of the Trust and Safety team. Those are the folks who were focused on making sure that users had a safe experience on the site. And that means limiting upsetting, offensive, hateful content. One thing that they had put into place is a verification system where they had verified thousands of notable people who work in journalism or media or have official positions, and beside their names there would be a checkmark that showed that Twitter itself had verified those folks.

Davey Alba: Anyone can buy a verified checkmark, the blue checkmark, as they call it. Getting the check mark now allows you to possibly reach audiences even more, given the algorithmic boost that Twitter gives the people who purchase this checkmark. And so, you know, if you’re Amazon former, then you might be motivated to sign up for that verification that that X premium program, as they call it. One thing that happened in just the past few days is an account of someone who claimed to be a journalist from Jazeera posted that they had seen a Hamas bomb fall on a hospital in Gaza. This account was verified with a check mark. Under the new system where anyone can buy a check mark for $8 a month.

Davey Alba: It took a while for people to point out that this person actually didn’t work for Al Jazeera. Eventually the account was suspended. But that’s the sort of thing that is really dangerous when you just glance at the platform and you see all these markers that are supposed to make you trust things that you see on the site. And the reality is that you can’t really trust anything on it right now.

Ericka Cruz Guevarra: It sounds like a new kind of system that actually rewards misinformation and people who spread misinformation on the platform. But there’s also Elon Musk himself. Right. And his rhetoric, his attempt to sort of rebrand Twitter. How has his rhetoric also contributed to some of this stuff we’re talking about?

Davey Alba: When Elon Musk finally stepped into the CEO role, there was a huge surge in hateful speech. Elon Musk has said that he wants Twitter, now X, to be the haven for all free speech.

Elon Musk: A good sign as to whether there is free speech is: Is someone you don’t like allowed to say something you don’t like?

Davey Alba: A lot of people who post on Twitter take that as, well, now I can say anything: anti-Semitic tropes, racist slurs. I think a similar thing is happening with the misinformation around the Israel-Gaza conflict. In this conflict, he has recommended two accounts that have a track record of spreading misinformation or just unverified information.

Ericka Cruz Guevarra: Where does that all sort of leave people who are still on this platform? Myself included, I mean, just people who are trying to figure out, in good faith, what the hell is going on in the world.

Davey Alba: Researchers I talked to in the space have told me that this is unfair and that enormous companies with years of experience and vast reach should be the ones to ensure that the people on their platforms are safe. And that is not what appears to be happening on a basic level. One feature that I will highlight under Elon Musk’s Twitter is Community Notes. Community Notes is a crowdsourced feature where people who are part of the program write notes that add context to any post on the site that may contain misinformation or may be misleading to people. And then people vote on those notes, and if a note gets enough votes, it gets displayed on the site. That is a thing that replaced some of Twitter’s own decision-making on misinformation that’s been going around on the site. The problem with this whole system is that Twitter has basically pushed off the responsibility of keeping everyone on the platform safe onto the users.

Ericka Cruz Guevarra: Coming up, why this problem goes way beyond Elon Musk and what it means for the rest of us.

Ericka Cruz Guevarra: Is X unique? Or is this something that we should be worried about on a larger scale, like social media platforms becoming like worse and worse places to find facts about breaking news?

Davey Alba: I think that there was always a risk for people to be using social media as their main way of getting their news. The platforms have shown themselves to be unreliable on this front. I think that Twitter was, you know, sort of fully in lockstep with the other social media platforms in their struggles in the past few years, as we’ve seen democracy put at risk because of viral misinformation. But I think we are also transitioning into a new era.

Davey Alba: The companies seem to think that even with all the moderation, they still get criticized heavily for their moves. And so our reporting has found that there has been a move away writ large from humans looking at the content and making judgment calls and moderating to more automated systems. Maybe the platforms feel like they can’t really win.

Davey Alba: It has also been a difficult economic year, with lots of layoffs across all of the Silicon Valley companies. This sort of content moderation doesn’t add to investor value. There’s also, more broadly, a trend of governments, including our own, making the case that these platforms should not engage in censorship. This is usually coming from Republican lawmakers in Congress. There are all sorts of subpoenas and groups that have called for less coordination between local governments and some of these platforms around election posts, things like that. There’s a lot of pushback on even the idea of misinformation, which has become so politicized. And we’re sort of entering a scary new era, with Twitter leading the charge in rolling everything back.

Ericka Cruz Guevarra: How would you maybe explain to someone like why it still matters what Elon Musk decides to do with this company? Because, I mean, I guess you could argue that no one has to be on Twitter or X, but why does it still matter?

Davey Alba: I think it matters because the reach of the posts is still enormous. And from my past reporting on the misinformation beat, you can see that the problem is really a cross-platform one, whether something starts on X and then spreads to Facebook or to conservative talk radio. These posts that carry false information are designed to elicit a specific reaction from people: outrage, ideas of violence. And we’re already starting to see the consequences of some of these things.

Speaker 2: Thousands of people packing a mosque in Bridgeview to mourn the death of a six-year-old Palestinian American boy allegedly murdered because of his Muslim faith.

That boy’s name is Wadea Al-Fayoume. He and his mother were attacked in their home.

Davey Alba: There was a six-year-old boy who was killed after the landlord of the building he lived in had spent hours listening to news on the radio about Israel and Gaza. And I think there is a risk of more tragedies like that. Users now carry a much greater responsibility to verify what they see on these platforms, using all the knowledge that they have at their fingertips and the years of experience people have had from being on these platforms for, you know, a decade now. There’s this savviness that people have to rely on, and that’s just kind of the reality of where we are right now. I always think about fighting misinformation as a community effort. You know, there’s not one NGO or one media company or one great CEO that’s going to solve the problem. It’s when everyone has an awareness of their civic responsibility and this good faith to try to make sure that everyone else around them is safe and informed and knows what is fact and what is fiction.

Ericka Cruz Guevarra: Well, Davey, thank you so much for sharing your reporting with us. I really appreciate it.

Davey Alba: Thanks for having me.


Ericka Cruz Guevarra: That was Davey Alba, a technology reporter for Bloomberg News. We’ll leave a link to Davey’s reporting on this story in our show notes. This conversation with Davey was cut down by producer Maria Esquinca. It was scored by senior editor Alan Montecillo, with music courtesy of Audio Network. Additional production support from me. The Bay is a production of member-supported KQED in San Francisco. I’m Ericka Cruz Guevarra. Thanks for listening. Talk to you next time.
