How the Gaza War Is Reshaping Social Media

As reports of Gaza censorship on Instagram and Facebook raise alarms, Congress targets TikTok while X profits from government surveillance.

Social media apps on a phone seen on Feb. 28, 2024. Illustration: The Intercept/Press Association via AP Images

Meta — Facebook and Instagram’s parent company — refuses to provide evidence refuting widespread reports that it’s censoring Gaza-related content on its platforms. This week on Deconstructed, technology reporter Sam Biddle joins Ryan Grim to discuss his recent reporting on the efforts of Sens. Elizabeth Warren, D-Mass., and Bernie Sanders, I-Vt., to press Meta for specifics.

Grim and Biddle dig into debates blaming the horrifying images coming out of Gaza for turning young people against the war. “When people see images of horrific bloodshed,” Biddle says, “when they see bodies blown apart by bombs, that’s upsetting to most people. There doesn’t have to be any ideology attached.” They also dive into how pressure to sanitize Israel’s war is being used to ban TikTok, and how X, formerly known as Twitter, is profiting off of government surveillance.

[Deconstructed theme music.]

Ryan Grim: Welcome to Deconstructed. I’m Ryan Grim.

Social media has played a really interesting and unusual role in the way that we’ve understood the conflict between Israel and Palestinians in Gaza over the last several months, in a way that reminds me, in some ways, of how the advent of Twitter and Facebook in the 2010s really shaped the way that the world understood the Arab Spring, and how people at the time were able to connect. And then, also, how the dictators that faced the consequences of the ability of those people to connect started to blame Twitter and Facebook for the fact that they were now getting toppled.

And so, I say all that to preface the conversation that I’m going to have today with my colleague, Sam Biddle, who reports for us on technology, with a focus on surveillance, privacy, corporate power, and all other things related to tech and power.

Sam, welcome to Deconstructed. Thanks for joining me here.

Sam Biddle: Thank you so much for having me on.

RG: So, we’ve got this really interesting situation that has developed where, on the one hand, you’ve been doing a ton of really fascinating reporting on the way that social media platforms, these social media giants, have been censoring users kind of in a pro-Israel direction. Yet, at the same time, that’s overlaid by this fact that, the more somebody is on social media — and the younger they are, in some ways — the more likely they are to sympathize with Palestinians in this conflict. The more likely they are to get a more accurate insight, more accurate window into what’s going on on the ground, despite the fact that all of this censorship and push in one direction is going on. And maybe that’s simply because the corporate media here in the U.S. is just so, so slanted, that even any window into what’s going on on the ground creates that change.

You’ve been doing a lot of work on Facebook in particular. How would you say that they responded in the first couple of weeks after October 7th?

SB: So, Meta, which is the parent company of Facebook and Instagram, controls the speech of billions of people around the world, including in Israel and Gaza. It was really striking to see how little had changed since the last time there was a prolonged period of intense Israeli military action against Palestinians, Israeli state violence against Palestinians, in 2021. There was a lot of debate and controversy around the way Meta handled itself in 2021; namely, the suppression and abrupt deletion, and general, I think what’s fair to call censorship of information about that violence.

In the aftermath of that, Meta made a lot of promises to various civil society groups and the general public about lessons learned, and how it had— You know, there were some mea culpas. But then, fast forward a few years and it was essentially the exact same: posts disappearing, people having their accounts locked without explanation, in a very, I think it’s fair to say, slanted — as you put it — way. Most of this suppression intentionally or otherwise ended up affecting Palestinians, or people sympathetic to Palestinians.

And so, yeah, it just really seemed like just a total rehash of the past. So, I think very little has meaningfully changed since the last time Meta was in a position to control people talking about a war.

RG: The ubiquity of the censorship is what has always struck me about it. It’s like, almost anybody who you spoke to or who you speak to about posting on social media — particularly Facebook and Instagram — has stories of their posts getting throttled, taken down, their account suspended. Often with no explanation, no obvious reason.

I think even the watermelon emoji that people started using. It had some connection, but it was also a way to kind of get around the censors. People were just coming up with all of these different codes to try to make sure that they weren’t getting caught in these nets, but they didn’t know how the nets were designed.

So, what did the civil society groups say in the first couple weeks after October 7th? Because you now have two years, three years of Meta saying that, we hear you, you’re valid, we won’t do this again. And then, all of a sudden, it seems like the exact same thing is going on. So, what response did they get back?

SB: To Meta, I think it was just sort of a, are you kidding me? Are we really having these exact same conversations again after the assurances you gave us the last time, and the time before that? I mean, the groups that I speak to are very frustrated for obvious reasons, but I think that they see their relationship with Meta, in a kind of stakeholder/consultative role, as something they take very seriously and put a lot of work into. And I think they do not always feel as if Meta is taking it as seriously and, really, to the extent they’re listening, not really doing anything with the feedback they get from these groups. As evidenced by the fact that, like I said, the same things were happening all over again.

Back in 2021, you had widespread removal of content attributed to breaking the rules. And then, you also had widespread disappearance of content with no apparent explanation, that Meta later blamed on technical errors.

After October 7th, the exact same thing. A combination of things being taken down for breaking the rules, whatever those rules may be, and then just inexplicably vanishing, that the company later says, oh wait, sorry, that was a problem. Or, not always just disappearance of content but, for instance, there was, I think, maybe in November of last year, 404 Media reported that certain phrases in Arabic in Instagram profile text were being translated to the word “terrorist,” in the context of a Palestinian user. Which the company then said, oh, sorry, that was just a machine learning translation error.

But that response, I think, is really emblematic of Meta’s approach. It’s a very, I think, problematic combination of the system working as intended, and the system being broken. And when you apply a system that is that dysfunctional to the speech of billions of people, you’re going to get many, many, many, many false negatives, false positives, however you want to categorize it. But the end result is censorship, whether it’s intentional, or a byproduct of Meta using stuff that doesn’t work.

RG: Despite all that, Instagram and TikTok really changed the way that the world understood this conflict, and now understands it to be a genocide because we — and I say the collective “we” around the world — have been able to watch it unfold through Instagram and TikTok, despite all that.

You had people like Motaz, Bissan, Hind Khoudary, reporters who would go direct to Instagram, direct to TikTok, in ways that were very familiar to people. You know, starting out with a, “hey guys.” Except instead of “hey guys, like, here’s my makeup routine,” it’s like, “hey guys,” and you can hear the thudding of the bombs around them.

How did Facebook and Instagram respond to that? There was a lot of pushback at the same time from pro-Israel forces that were saying, this is unfair, you’re indoctrinating kids by giving these direct linkages, creating these parasocial relationships.

So, what was it like for Facebook now to be kind of caught between both sides? Because now, not only do you have the pro-Israel lobby wanting to straight-up ban TikTok, they’re also livid about how they’ve lost control on Instagram.

SB: Yeah. I think that Meta realizes they can’t [suppress it all]. And, in fairness, nor is there any indication this is something they would want to do. There is no will to censor depictions of the war. Where you start to see trouble is in the edge cases, in the gray areas.

An example that comes to mind is, immediately after the Russian invasion of Ukraine, Meta reminded all of its moderators that images of Russian airstrikes, even if they are very, very gruesome and graphically violent, things that would typically be potentially subject to removal, should be preserved, because they have news value. It wasn’t a carveout, but it was a specific reminder to preserve documentation of Russian atrocities.

No such directive or memo or anything like that was ever provided in the aftermath of October 7th about Israeli airstrikes. So, there’s been no effort on Meta’s part whatsoever to institute some sort of blackout about this, but they, to say the least, have not taken the kind of proactive steps they did around Russia and Ukraine, and I think that is where you start to get into some really concerning areas here.

The thing about TikTok and Instagram, I think, is really fascinating. Because so much of the hysteria around the indoctrination of youths treats these platforms as if there’s some sort of sophisticated brainwashing operation, when I think it stands to reason the explanation is pretty simple. When people see images of horrific bloodshed, no matter whose blood is being shed, people don’t like that. When they see bodies blown apart by bombs, that’s upsetting to most people. There doesn’t have to be any ideology attached to that or any indoctrination attached to it. People see that happening and think it’s awful.

As much as the Palestinian perspective does get suppressed, enough is getting through, right? Thanks, in large part, to some of the journalists that you mentioned. Enough is getting through that the facts on the ground are undeniable, and just viscerally disturbing to anyone who sees them.

RG: Tell me if you think this is right, but what seems to be unique about this conflict here is the way that people were able to develop these parasocial relationships with the reporters who were themselves under threat. The rise of TikTok and Instagram as real sources of news for people is somewhat recent. It was around during Russia’s invasion of Ukraine, and I think if, let’s say, the assault on Kyiv had lasted longer than a couple of days — it was repelled very quickly, and then you wind up getting this trench warfare, and these offensives and counteroffensives — if the assault on Kyiv had lasted longer, I think you would have developed those same parasocial relationships with Ukrainian journalists or citizen journalists, who would have been broadcasting from their apartments and creating those connections.

Whereas once it moved to more trench warfare and more traditional warfare, then you had plenty of journalists who are on the front lines, but, A, people were trying not to use phones, because the second that you would use a phone, someone would figure out where you were, and you had instances of that. Russians were doing it, Ukrainians were doing it, and then, boom. So, it’s not worth it.

But also, those are reporters who kind of traveled to the front lines, and it’s a more traditional type of setting. Whereas these reporters, these were just people who were normal people on October 6th; some of them working in journalism, some of them not, at the moment, working in journalism. And then, October 7th, 8th, 9th, they have nowhere to go. It’s not as if they’ve thrust themselves into this for some thrill-seeking, or even professional reasons. Just, they’re there, and they’re telling you about their life.

SB: They’re trapped there. I mean, not to at all diminish the bravery that it requires to document something like that, but they don’t really have a choice, right? I mean, they’re literally trapped in this war. Whereas, right, as you point out, to the extent that there are reporters covering the war in Ukraine now, they’re embedded, and there’s a formal process there.

I mean, a lot of the footage coming out of Gaza is just people wandering around, because they have nowhere else to be. It’s an extremely densely populated urban area, whereas the fighting between Russia and Ukraine is happening in, I think, at this point, completely depopulated areas, where there’s no one there to see any of it, except the people doing the fighting, and maybe a handful of embedded Western reporters.

That’s something that’s really interesting. You brought up the Arab Spring earlier, that phase of thinking about social media. And, as much as I agree that the footage and imagery coming out of Gaza has been extremely powerful, I do think that it challenges some of the premises of Arab Spring discourse. A lot of that was basically sort of digital democracy stuff; if you give people access to Twitter, you will overturn dictators. Like, the technology is itself powerful enough to cause change, right? To cause actual political material change.

It’s hard to imagine footage coming out of Gaza, of Israeli airstrikes and other Israeli assaults, that is more disturbing than what we’ve already seen, right? We’ve already basically maxed out the capacity for horror, right?

But the war is still going on. There doesn’t seem to be any real challenge to the prosecution of that war. Popular support for it in the U.S. might ebb and flow, but the government’s commitment to supporting that war, material commitment, hasn’t changed an iota, as far as I know. So, I do think that this really undermines a lot of that Arab Spring optimism, that you could post your way to change.

And, again, this is not to diminish at all the efforts of the people doing it.

RG: In some ways, sometimes, dictators are easier to overthrow than democracies.

SB: Sure.

RG: Some of those pre-Arab Spring dictatorships turned out to be extraordinarily fragile. You get enough people into the streets and the people at the top turn on them, and they’re on a plane, and they’re out of there, or they’re dragged through the street.

Whereas, in the U.S., I think it clearly has had an effect on public opinion. We just had a Gallup poll come out this week — the first since the Flour Massacre — that found, I think, 75 to 78 percent of Democrats opposed, to only 18 percent in support; basically, about 78 percent of Democrats opposing Israel’s war effort. Overall, it was like 55 percent of the entire public against, even Republicans, something like 30-plus percent had turned against it. And you see the numbers wildly fluctuating, depending on your age. And it seemed like, the more you’re seeing, the more you’re against it.

Now, the fact that that hasn’t led to policy change is a separate question.

SB: Right. Yeah. I think it says more about the disconnect between public opinion and policy, which isn’t a social media thing. But it does remind me of how there was so much hope among technologists that social media would be what it takes to make changes in public opinion. And maybe they are, in their own way, but the extent to which public opinion about this war has been shaped by documentary, unimpeachable evidence, and yet nothing changes, has been, I think, really revealing and disillusioning, in a lot of ways.

I mean, you can have all the evidence in the world right in front of you, and so, what? I mean, at the end of the day, nothing is happening. I think, in a way, it might be the official end of that kind of Arab Spring idealistic thinking about social media specifically. Not to suggest that people should stop caring about this stuff, but the divide between what people want from the people they put into power and what those people do, I cannot remember an example of it being more stark.

RG: At the same time, I think there’s another interesting parallel to the Arab Spring, because those in power do still seem to feel threatened by this relationship that people have developed with the people they’re seeing destroyed on social media. And you can tell that, by the fact that they’re trying to ban TikTok.

You see all this energy put into this argument that the Chinese are manipulating young people into hating Israel’s war effort here, like that’s what’s going on here. It’s not what you think it is, it’s actually that they’re being brainwashed into it.

So, tell me a little bit about TikTok’s relationship to Palestinian dissent as well. Because, over the years, I’ve also heard from Palestinians, or people who are supportive of the Palestinian cause, that they too have had enormous difficulty posting without getting nuked.

So, how have we gotten to a place where, despite that, it’s still a threat big enough that they could get the House to pass a bill to either sell or ban it?

SB: Yeah. I think that the anti-TikTok movement has become a pretty big tent. I think it is, at this point, largely a movement of neo-Cold Warrior types, China hawks, people who have a material interest in hostility with China, short of warfare — worse than friendly, not as bad as going to war, although I’m sure a lot of them would be very thrilled with a shooting war with China — a union of that branch, which is very old, and political Zionism.

I think a lot of staunch supporters of Israel and the current war are scapegoating TikTok. I mean, it’s very hard to tell to what extent people who say TikTok is brainwashing children into hating Israel, or TikTok is brainwashing our college students into being Hamas guerrillas, it’s hard to tell how much they actually believe that, versus how much they resent TikTok for sharing images of dead Palestinians with young Americans.

RG: The argument feels as absurd as we heard back during the Arab Spring because, back then, you’d hear Mubarak and his supporters saying, the problem here is Facebook and Twitter, and if it weren’t for them, I’d be fine. Iran saying the same thing the year before that. OK, but everything that they’re saying is true about you. They’re not making anything up. It’s just that they’re able to now share it more freely.

And so, to see Americans making that same complaint almost 15 years later, when the same is true. Like, nothing they’re saying is untrue, it’s just that people have the freedom to kind of share the things again. It’s like, you sound like Mubarak. You sound like a tinpot [dictator].

SB: Also, the great irony here is, who bans social media platforms because they present a national security threat to the populace? China, right? I mean, that was supposed to be one of the great differentiators between how our societies treat information. Which is, if you don’t like it, that sucks, but you let it happen, because the alternative is tyrannical.

Yeah, there is a great irony there, but I think that, if you are a supporter of the Israeli war effort, you can’t say, we can’t allow young people to see images of dead Palestinians — as much as I’m sure that is the underlying desire — but you can say we want to ban scary Chinese communist software that is allowing people to see those images.

To your point before, yeah, plenty of people — not exactly the same demographic cohort — but plenty of people are seeing the exact same stuff on Instagram, or on Twitter, or on Telegram — a Russian app. But you don’t see the same legislative push, obviously, to ban those platforms. I think that the resentment toward TikTok because it is Chinese gave a ready-made vehicle to those who resent it because it allows unfettered access to documentation of atrocities. There’s something very convenient there, right? It’s like, here’s a mega-popular app that is undermining support for this war. The good news is, there’s already this built-in movement to ban it, because it’s Chinese, so I think there’s been sort of a bandwagon effect there.

But those two lobbies combined are very powerful. Whether they’re powerful enough to actually push the legislation through all the way, I don’t know, but they’ve made it this far.

RG: You had a story, recently, about this guy Jacob Helberg, who was kind of one of the leading advocates of the TikTok sale/ban. And you reported that not only is he a member of the U.S.-China Economic and Security Review Commission, but he also works for Palantir, a Peter Thiel-linked tech firm that really benefits from escalating U.S.-China conflict. What was Jacob’s role in this?

SB: Predating his employment at Palantir, he was a longtime China hawk. The Wall Street Journal had a couple very illuminating reports about this — one from early March, I believe, and one from last year — about his effort to just basically whip up support for a ban. I think the Journal said that he had met with something like a hundred different legislators. He advocated for banning TikTok to all of them.

If you take him at his word, his view is that TikTok represents a genuine national security threat to the United States because it is, in fact, not a social media platform, it’s a Chinese military intelligence operation meant to do what, exactly? Hard to say, because there’s no proof.

RG: Spread makeup tips.

SB: Yeah, there’s no evidence for any of this, you sort of have to use your imagination, I guess. But the difference is that now he works at Palantir, which markets itself increasingly as a necessary counter to Chinese military power.

I think it is very fair to assume that any escalation in tensions between the United States and China benefits military contractors who pitch themselves as counters to Chinese military power. That is just how defense contracting works. That’s not a conspiracy theory or even, really, controversial. War, or the approach of war or hostilities, even, are good for the defense industry. That has always been true, it always will be true. But what you have with Jacob Helberg is, someone who is essentially pushing legislators to dial up tensions between the United States and China while he is an employee of a company that profits from deteriorating relations between the United States and China.

Now, look, this is not anything remotely unique to Jacob Helberg, or Palantir, or TikTok. People in these conflicted positions have advisory roles with the government all the time, same as it ever was. But I focused on that in my article just because the fact of his employment at a defense contractor has been — I mean, he doesn’t hide that fact — but the fact of it is essentially absent from discussion of his role, which I thought was relevant to that role, to say the least.

RG: And, since the House passed that legislation that would require it to be spun off to a non-Chinese buyer or banned within, what, 180 days or so? Steve Mnuchin came out and said that he was putting together a group of investors. We know that the investors in Mnuchin’s firm include the Saudis, the Emiratis. At one point, before he retired, the Mossad spy chief was reported to be talking to Mnuchin about a major investment in his firm.

Has that put the brakes a little bit on this? Like, there were a couple of days of stories saying, like: Wait a minute, we’re going to force China to sell it to Saudi Arabia, who put Pegasus spyware on an American journalist’s wife’s phone, and then chopped him to pieces?

SB: Right. I think, in a lot of cases, U.S.-Saudi relations expose the hollowness of a lot of stated American values. I mean, Saudi Arabia owns a chunk of Twitter. I have not seen — maybe he has, and if he has, I apologize — but I’m not aware of Jacob Helberg and his fellow China hawks making a stink about that. For, I think, reasons that are pretty clear.

I think China is a very effective boogeyman right now, and that doesn’t require denying or whitewashing Chinese human rights abuses. But they are a very helpful adversary right now if you are in the business of defense, intelligence, weapons, now including AI. I think the only anti-China voice in Silicon Valley louder than Peter Thiel is arguably Eric Schmidt, who also has an enormous vested interest — financial interest — in the deterioration of U.S.-Chinese relations.

The Saudi presence, I think, just reveals a lot of this is a put-on.

RG: So, back to Meta, you reported that, what, Elizabeth Warren and Bernie Sanders, I believe, reached out to the company to ask them what was going on? Because, you know, civil society groups don’t have any luck, users don’t have any luck, so maybe a couple of senators could get some answers.

What did they ask, and what did they find out?

SB: In December of last year, Senator Warren sent a letter to Meta asking something like a couple dozen different, very specific questions about content moderation pertaining to the war in Gaza. The marquee questions were about how much speech has been removed, broken [down] — and this was crucial — how much of it has been in Hebrew, how much of it has been in Arabic?

So, asking for specific figures about censored speech, whether it was a genuine violation of some rule or whether it was an inadvertent takedown, or whatever. Asking for concrete figures, broken down by language, because the point of the letter was to get documentation about prejudicial content moderation. Essentially, the idea that Meta is treating Palestinians and those sympathetic to them differently from Israelis and those sympathetic to them.

RG: What did they find?

SB: Well, they found nothing. Meta replied with a letter that, I noticed, was largely copy-and-pasted from a press release they put out in October. The only new information that I could see in the letter was that Meta disclosed that they had removed 2.2 million posts in the week after October 7, but that was not broken down by language. They totally punted on the underlying question of, are you treating Palestinian speech different from Israeli speech, or speech about Palestinians different from speech about Israelis and about Israel, and about the war? Is there discriminatory content moderation happening here? And they just ignored it. They didn’t answer any of the questions.

Now, Senator Warren, joined by Sanders, they’re basically sending the same letter again, saying, we’re giving you another shot here, one more time. Can you answer our actual questions? We’ll see. I would be surprised if they received specific answers.

RG: Where is all this headed?

SB: As far as Meta is concerned, I think it’s headed nowhere. They are not obligated to divulge that information just because someone asked nicely. You can draw whatever conclusions you want about why they may not want to divulge that information. But there are a lot of civil society groups around the world that have documented many cases of speech that is critical of Israel or sympathetic to Palestinians being treated differently than speech that is not.

Only Meta, however, has the motherlode of data that could demonstrate the scale of this discriminatory enforcement. I mean, again, there’s no reason that I’m aware of to doubt the work that’s being done by groups like Human Rights Watch, and 7amleh, and many others to document this kind of stuff but, at the end of the day, Meta can say, OK, well, that’s anecdotal, or that’s circumstantial. You know, you’ve got a thousand examples of biased content moderation, but you don’t have the whole picture. Only they have the whole picture.

What we know about large internet companies is that they do not voluntarily disclose information about how they enforce their policies unless they have to. Meta puts out a transparency report about the ballpark figures of the number of pieces of content they’ve removed, but nothing about the specific mechanism by which it was removed, or why. We know the what, but not the why, and I think that’s what Sanders and Warren are trying to get at.

But, you know, good luck. I have a feeling they will not be able to get much out of them.

RG: While I’ve got you here, last question. Talk to me a little bit about this story you did this week about Elon Musk, how he’s been kind of publicly fighting government surveillance, but apparently profiting off of it at the same time.

SB: Yeah, and I think it’s very important to preface this by saying this is a relationship that predates Elon Musk’s purchase of Twitter by many years. But, back in 2014, Twitter sued the government so that they could say exactly how many national security letters they had received.

National security letters are a way for the government to compel companies like Twitter, or Google, or whomever, to turn over private customer information. So, the government could go to Twitter and say, hey, we want this person’s Twitter DMs, and it’s a matter of national security. Also, you are gagged, and can’t disclose the fact that we asked you for this, and forced you to give it to us.

Twitter, I think rightfully, said, this isn’t good, we need to at least be able to tell people this is happening. Under Musk, they tried to take it all the way to the Supreme Court. In a petition to the court, Musk’s lawyers — Twitter or X’s lawyers — said in no uncertain terms that government surveillance of electronic communications is prone to abuse, and is something the public needs to be informed of.

While this is all happening, Twitter essentially leases all of the information, all of the public information on its platform to a company called Dataminr, which then sells it to law enforcement agencies all across the country, and the DOD. And, as my reporting has demonstrated, police departments, including federal police, use Dataminr to spy on Black Lives Matter protests, abortion rights rallies, and other First Amendment-protected activities.

So, it would seem to be the exact kind of abuse that Musk’s lawyers were talking about in their petition to the Supreme Court, which the Supreme Court declined to hear, so that is sort of dead.

But I’m curious what you think. I think that Musk has really positioned himself in a way that previous Twitter executives did not, as a kind of heterodox government skeptic, right? Like, he doesn’t trust the government, he thinks that you shouldn’t either. He’s suspicious of U.S. foreign policy and police powers but, you know, then, at the same time, he is a major defense contractor through SpaceX, and also a surveillance vendor through X and Dataminr.

So, there are inconsistencies in the man’s ideology, to say the least.

RG: Given the level of conspiracy that he’s willing to accuse the American government of, you would think that would rule out, then, selling all of your personal information to that very same government.

SB: One would think. Yeah.

RG: You would be wrong.

SB: You sure would.

RG: Well, Sam. Thank you as always. I really appreciate you joining the show.

SB: My pleasure.

RG: All right, that was Sam Biddle and that’s our show.

Deconstructed is a production of The Intercept. This episode was produced by Laura Flynn. The show is mixed by William Stanton. Legal Review by Shawn Musgrave and Elizabeth Sanchez. Leonardo Faierman transcribed this episode. Our theme music was composed by Bart Warshaw.

If you’d like to support our work, go to theintercept.com/give. And, if you haven’t already, please subscribe to the show so you can hear it every week. Please go and leave us a rating or a review. It helps people find the show.

If you want to give us additional feedback, email us at podcasts@theintercept.com.

Thanks for listening, and I’ll see you soon.
