Misinformation in the Israel–Hamas war
Misinformation in the Israel–Hamas war refers to the dissemination of false, misleading or unsubstantiated information during the Israel–Hamas war. Much of the content has been viral in nature, with tens of millions of posts in circulation on social media. A variety of sources, including government officials, media outlets, and social media influencers across different countries, have contributed to the spread of these inaccuracies.

Overview

On Israel

A photograph released in the early hours of the attack appeared to show Major General Nimrod Aloni, the commander of the IDF Depth Corps, being held by Palestinians, and Hamas claimed to have captured him. A Persian-language post by the IDF quoted a post about his capture from the Tasnim News Agency and wrote "Tasnim: Distributors of fake news of IRGC", without either denying or confirming Aloni's capture. Aloni was subsequently seen on 8 October attending a meeting of top Israeli military officials.

Islamic Republic of Iran Broadcasting published images of the capture of commanders of Nagorno-Karabakh by the Azerbaijani army in September 2023 as the capture of Israeli commanders by Hamas.

A video of a CNN broadcast from near the Israel-Gaza border with audio added to suggest the network had faked an attack went viral on social media.

Social media accounts based in India have spread pro-Israeli disinformation, with influencers misrepresenting videos purported to show school girls taken as sex slaves, or Hamas kidnapping a Jewish baby. Fact-checker Pratik Sinha said the "Indian right-wing has made India the disinformation capital of the world". The trend forms part of a wider pattern of fake news in India with an Islamophobic slant, including disinformation on Palestinians coming from the BJP IT Cell, a vehicle of India's governing party, the BJP.

An Israeli boy and his sisters killed during Hamas's attack on Kibbutz Nir Oz on 7 October have been falsely accused of being "crisis actors".

A photo shared by Israel showing the charred corpse of a baby was claimed by many on social media to have been AI-generated, based on AI detector "AI or Not". The claim was repeated by Al Jazeera Arabic. The company behind "AI or Not" later said that the result was a false positive caused by the image's compression and blurred name tag; several experts who looked at the photo found it to be genuine. Other social media users claimed, based on a 4chan post, that the image had been altered from a similar photo of a dog, though researcher Tina Nikoukhah found that the dog picture was likely "falsified using generative methods".

The mostly ultra-Orthodox ZAKA volunteer paramedic and rescue group began collecting bodies immediately after the Hamas attacks, while the IDF avoided assigning soldiers trained to carefully retrieve and document human remains in post-terrorism situations. However, as part of an effort to gain media exposure, ZAKA spread accounts of atrocities that never happened, released sensitive and graphic photos, and acted unprofessionally on the ground, often mixing up the remains of multiple victims in the same bag and creating little or no documentation about the remains.

Allegations of beheadings

Dead baby in oven claim

In a speech to the Republican Jewish Coalition on 28 October, Eli Beer, founder of an Israeli EMS organization, claimed that Hamas had burned a baby alive in an oven. The claim was repeated by journalist Dovid Efune, commentator John Podhoretz and others, in tweets seen over 10 million times. Israeli journalists and police found no evidence for the claim, and a representative of ZAKA, a first responder organization, said the claim was "false". Other first responders reported that the event did happen and that a baby was found in the oven of a house the attackers had burned down.

Sexual violence

On 9 January 2024, the Hamas rape report, titled 'Screams Without Words': How Hamas Weaponized Sexual Violence on Oct. 7, was pulled from the schedule of The New York Times podcast The Daily. The "veracity" of The New York Times story was undermined by the relatives of one of the victims, Gal Abdush; members of the Abdush family stated that there was no proof of rape and that The New York Times had interviewed them under "false pretenses". The article reportedly caused what The Intercept called a "furious internal debate" about the strength of its reporting, and according to Jeremy Scahill it was met with skepticism from other New York Times journalists.

On Gaza

Inaccurate information

Viral claims that the IDF had destroyed Gaza's Church of Saint Porphyrius on 9 October were debunked by the church. However, Israel attacked the church on 19 October, killing 18 civilians.

In October 2023, disinformation experts uncovered an account on X that published false reports about Qatar threatening to cut off its gas exports if Israel continued to bombard the Gaza Strip.

Pro-Hamas accounts have misrepresented footage from the Syrian civil war as showing children being killed in Gaza.

In February 2024, Israel's official X account posted a 30-second video listing the humanitarian aid it claimed to have provided for Gaza. The video included March 2022 footage of a camp in Moldova for Ukrainian refugees that was falsely claimed to show tents and shelter equipment for Gazans. The video was subsequently deleted.

Impersonations

Following the Al-Ahli Arab Hospital explosion, an X account claiming to be an Al Jazeera journalist said they had video of a "Hamas missile landing in the hospital". Al Jazeera subsequently clarified that they were not associated with the account, and it was later removed. Another X account that promoted pro-Kremlin misinformation claimed The Wall Street Journal had reported that the explosion was caused by a Mark 84 bomb; The Wall Street Journal had not published such a report.

In November 2023, a video appearing to show a nurse at the Al-Shifa hospital went viral. She claimed that she was unable to treat patients because Hamas had taken over the entire hospital and was stealing fuel and medicine, and the video ended with her pleading for all Palestinians to leave Al-Shifa. Many were quick to point out falsehoods in the video: none of the documented doctors and nurses at the hospital recognized the woman depicted, and she reportedly spoke with an Israeli accent and could not speak clear Arabic. Additionally, according to Esther Chan of RMIT FactLab CrossCheck, an analysis by open-source investigators determined that the video had likely been doctored to artificially include fake sounds of explosions.

Allegations of crisis acting

Video evidence of atrocities in Gaza has frequently been dismissed as acting, with people falsely accused of being "crisis actors", in order to downplay the killing of civilians by Israeli airstrikes. The derogatory and dismissive term "Pallywood" is often used; it derives from an unfounded fringe conspiracy theory that Palestinians falsify evidence of suffering. The fact-checking organisation Logically found that mentions of the term have increased since October 7, particularly in Israel, the United States, and India. Material falsely used to "prove" that Palestinians were crisis actors includes a video of body bags that appear to be moving, which was in fact footage of a 2013 protest in Egypt.

Saleh Aljafarawi, a Palestinian blogger and singer who lives in Gaza, was falsely accused by several pro-Israeli figures, including the country's official Twitter account, of being a "crisis actor". This included a video of a Palestinian teenager wounded in a raid on Tulkarm in July 2023, who was falsely presented as "Saleh in a hospital days before October 7".

In November 2023, Israeli diplomat Ofir Gendelman circulated a clip from a Lebanese short film, claiming that it was proof that Palestinians were faking videos and calling it an example of "Pallywood". According to The Daily Beast, "Gendelman is a repeat offender when it comes to peddling misinformation about Palestinians". The previous week, Gendelman peddled IDF training videos as war footage, and in 2021, he was found by international media to have misrepresented 2018 footage from Syria as current footage from Gaza.

Dead children as dolls

A video showing a Palestinian child killed during an October 11 Israeli airstrike on Zeitoun has been falsely claimed to be staged using a doll. The claim has been promoted by official Israeli government social media accounts, including the X accounts of Israel's embassies in France and Austria, as well as pro-Israel and anti-Hamas accounts.

In early December, The Jerusalem Post published an article falsely claiming that a dead, 5-month-old Palestinian baby from Gaza was "a doll". The Jerusalem Post later deleted the article and removed all mention of it from its social media pages. Though not mentioning the article directly, the paper published a statement saying that "The article in question did not meet our editorial standards and was thus removed."

Sexual violence

On 25 March 2024, Al Jazeera took down its video of a woman named Jamila al-Hissi who said that Israeli soldiers had “raped women, kidnapped women, executed women, and pulled dead bodies from under the rubble to unleash their dogs on them” at Al-Shifa hospital in its latest siege. Former managing director of Al Jazeera, Yasser Abu Hilalah, wrote on X, “Hamas investigations revealed that the story of the rape of women in Shifa Hospital was fabricated.” Abu Hilalah reported that al-Hissi “justified her exaggeration and incorrect talk by saying that the goal was to arouse the nation’s fervor and brotherhood.”

IDF reliability as a source

Israel has released several pieces of incorrect or disputed information, leading to weakened credibility and online ridicule.

Writing for openDemocracy, British academic Paul Rogers stated, "Israel must maintain the pretence of an orderly war with few civilians killed. Netanyahu’s government is lying, but it would be naive to expect otherwise. Lying is what many powerful states routinely do, particularly in wartime." In The Intercept, investigative journalist Jeremy Scahill wrote, "At the center of Israel’s information warfare campaign is a tactical mission to dehumanize Palestinians and to flood the public discourse with a stream of false, unsubstantiated, and unverifiable allegations."

The Al Jazeera Media Network released a statement following an Israeli attack on several of its journalists in Gaza, stating, "Given Israel's unprecedented campaign against journalists, Al Jazeera urges media outlets worldwide to exercise the utmost caution and responsibility when headlining Israel's justifications for its crimes against journalists in Gaza."

Misinformation

On multiple occasions, analyses have found issues with IDF claims. In October 2023, a Financial Times analysis on a bombing of Palestinians evacuating Gaza City found that "most explanations aside from an Israeli strike" could be ruled out, though the IDF blamed the attack on Palestinian militants. In November 2023, analysis by the BBC found that video released by the Israeli military following the Al-Shifa Hospital siege had been edited. In March 2024, the Israeli army said it had "fired precisely" at individuals who posed a threat to soldiers during the Flour massacre; however, a United Nations team investigating the massacre's aftermath stated there was evidence of heavy shooting of civilians by the IDF.

In December 2023, an analysis by The Washington Post confirmed reports by Human Rights Watch that Israel had used white phosphorus in an attack on Lebanon, directly contradicting the IDF. In January 2024, after an Israeli airstrike killed journalist Hamza Dahdouh, the IDF called Dahdouh a "suspect" who was hit while driving with a "terrorist"; however, The Washington Post found "no indications that either man was operating as anything other than a journalist".

In November 2023, a video posted by the IDF showed Daniel Hagari inside the Al-Rantisi Children's Hospital, where he claimed that the IDF had found Hamas weapons and technology, as well as a "list of terrorist names" in Arabic titled "Operation Al-Aqsa Flood", showing each agent's rota for guarding the hostages. However, a translation of the document showed that it contained no names, only a calendar listing the days of the week. After the claim's veracity was questioned, an Israeli spokesperson backtracked; CNN removed the segment but did not provide an editor's note acknowledging the change or the dispute over the initial video.

In December 2023, Philippe Lazzarini, the head of UNRWA, stated Israeli government officials and media outlets were "creating a stream of baseless misinformation" related to humanitarian aid delivery in the Gaza Strip. A January 2024 Haaretz analysis found ZAKA had propagated misinformation about the 7 October attack. Osaid Alser, a medical resident at Texas Tech University who previously worked at al-Shifa Hospital and Nasser Hospital, stated, "When we talk about tunnels and all of that, I think this is Israeli propaganda that everybody should get used to at this point. Anybody who has worked in any of these hospitals, they can easily say this is just nonsense." Qatari prime minister Mohammed bin Abdulrahman bin Jassim Al Thani stated, "When it comes to the killing of Palestinians, we see that there is an abandonment of the basic principle of truth".

Unverified information

In other instances, the IDF simply released unverified claims as facts. In December 2023, Israel stated there was a Hamas tunnel network connected to the Al-Shifa Hospital; however, a report by The Washington Post found "There is no evidence that the tunnels could be accessed from inside hospital wards." In January 2024, reporting by CNN found no evidence for IDF claims about tunnels under a cemetery in Khan Younis. That same month, Israel claimed 12 UNRWA staff members had participated in the 7 October attack on Israel; however, the Financial Times, Sky News, and Channel 4 all stated that Israel's claims had no evidence to support them. In February 2024, the IDF claimed that Palestine Red Crescent Society paramedics had treated wounded Hamas fighters on 7 October; however, the video it released apparently did not show a Red Crescent ambulance. The same month, the IDF stated Hamas was stealing humanitarian aid, leading David M. Satterfield, a senior U.S. envoy, to say there was no evidence to support Israel's claims.

Disinformation

In November 2023, Israeli government accounts widely shared the website hamas.com, claiming that it belonged to the armed group. It soon emerged that the domain had recently been purchased from Wix.com, an Israeli company, and that the site was entirely in English. BBC Verify confirmed that it was a fake website. The website showed graphic videos of the October 7 attacks, "Hamas testimonials," statistics on Israeli casualties from the attack, including the false claim of 41 babies murdered and burned alive, and a link for donations. After questions from Ynet, David Saranga, head of the Digital Diplomacy Division at the Israeli Foreign Ministry, said that "The decision to purchase the domain supposedly belonging to Hamas is a sophisticated way to confront those who sympathize with Hamas and justify its atrocities."

On 4 December 2023, Haaretz reported on Israeli claims about beheaded babies, stating that these "unverified stories [had been] disseminated by Israeli search and rescue groups, army officers and even Sara Netanyahu". Haaretz journalists Nir Hasson and Liza Rozovsky related the chronology of the news items about "beheaded babies" and "hung babies" and concluded, "this story is false". They quoted Ishay Coen, a journalist for the ultra-Orthodox website Kikar Hashabbat, who admitted he made a mistake by unquestioningly accepting the IDF's statements. "Why would an army officer invent such a horrifying story?", Coen asked, adding, "I was wrong". A ZAKA volunteer's story about a baby having been cut from a pregnant woman's womb was likewise found to be made up; in total only two babies are actually known to have died on October 7: one was struck by a bullet, the other was an injured Bedouin woman's baby that died in hospital shortly after birth.

"False flag" conspiracy theories

The October 7 attack by Hamas on Israel has become the subject of various conspiracy theories. These theories claim that the attack, which resulted in approximately 1,200 deaths in Israel, was a false flag operation conducted by Israel itself, despite the overwhelming evidence provided by multiple sources, including smartphone and GoPro footage capturing the breach of the border by Hamas forces.

This misinformation has proliferated across various social media platforms, where hashtags linking Israel to "false flag" operations have seen a significant increase in usage. The spread of falsehoods has not been limited to online spaces; it has also manifested in real-world settings, including city council meetings and public protests, where individuals have publicly denied the facts of the attack.

Researchers and Jewish community leaders have expressed concern about the ties these conspiracy theories have to Holocaust denial and other antisemitic beliefs, with denial of the October 7 attacks described as part of a broader pattern of misinformation that seeks to distort historical events and promote antisemitic narratives.

Another unsubstantiated conspiracy theory that emerged following the October 7 Hamas attack suggests that the Israeli government, specifically Prime Minister Benjamin Netanyahu, had prior knowledge of the attack. This theory, which lacks any credible evidence, also includes the claim that Netanyahu issued a "stand-down" order to the Israeli military. The genesis of this theory appears to be from Charlie Kirk, a far-right influencer and supporter of former U.S. President Donald Trump. Kirk's comments on a podcast, suggesting a need to question whether there was a stand-down order, fueled these claims. However, these assertions have no factual basis and hinge solely on Kirk's personal speculations.

The claim rapidly gained traction on social media platforms like TikTok, Facebook, and X, formerly known as Twitter, particularly with the hashtag #BibiKnew, referencing Netanyahu. Despite its popularity online, there is no substantial evidence to support the notion that the Israeli government was aware of the Hamas attack in advance. The lack of proof notwithstanding, the theory has been influential in certain circles, especially among those critical of Netanyahu's leadership and Israeli policies.

Disinformation campaigns

A fake memo that purported to show Biden authorizing $8 billion in aid to Israel circulated on social media and was cited in articles by Indian news outlets Firstpost and Oneindia.

According to information security experts interviewed by the New York Times, Iran, Russia, China, Iran's proxies, Al Qaeda and the Islamic State have been conducting massive online disinformation efforts focused on "[undercutting] Israel, while denigrating Israel's principal ally, the United States". Researchers have documented at least 40,000 bots or fake social media accounts, as well as strategic use of state-controlled media outlets like RT, Sputnik and Tasnim. An analysis by Haaretz found that hundreds of fake accounts on social media were targeting Democratic Party lawmakers with spam messages repeating Israeli government accusations relating to UNRWA and Hamas.

A Russian disinformation campaign known as Doppelganger has pushed false information about the war using fake websites that mimic the appearance of news sources such as Fox News, Le Parisien and Der Spiegel.

In February 2024, Volker Turk, the UN human rights chief, stated that the United Nations had been the subject of disinformation attacks, saying, "The UN has become a lightning rod for manipulative propaganda and a scapegoat for policy failures".

Fake videos

Videos and images falsely linked to the war included a video of children in cages posted on 4 October and footage from 2020 of Iranian lawmakers chanting "Death to America". In Egypt, photos appearing to show the Cairo Tower lit with the Palestinian flag spread on social media but turned out to be a modified version of a 2010 image of the tower. Footage from the video game Arma 3 has also been presented as war footage.

On October 8, a video supposedly showing Hamas thanking Ukraine for supplying it with weapons was shared by an X account linked to the Wagner Group. It was viewed over 300,000 times and shared by American far-right accounts. The next day, former Russian president Dmitry Medvedev tweeted, "Well, Nato buddies, you've really got it, haven't you? The weapons handed to the Nazi regime in Ukraine are now being actively used against Israel."

Social media users on both sides of the war shared behind-the-scenes footage of an actor lying in fake blood from a 2022 Palestinian short film, alleging it was evidence that the other side was creating propaganda. A video of Egyptian paratroopers flying over the Egyptian Military Academy that was falsely claimed to show Hamas militants infiltrating an Israeli music festival went viral on X in Indonesia.

Indian Twitter accounts spread an out-of-context video claimed to show "dozens of young girls taken as sex slaves by a 'Palestinian' fighter"; it most likely showed a school trip to Jerusalem. Another clip shared primarily by Indian users purported to depict a kidnapped baby; however, the video had been taken a month earlier and had nothing to do with Gaza.

An AI-generated video of model Bella Hadid supposedly apologising for her past remarks and expressing support for Israel circulated on social media.

Role of social media platforms

Disinformation about the war has spread on social media platforms, particularly X (formerly known as Twitter). The European Union warned Elon Musk and Mark Zuckerberg that X and Meta were hosting disinformation and illegal content about the war, with potential fines of up to 6% of the companies' global revenue according to the Digital Services Act.

In response to the reports, X's CEO Linda Yaccarino told EU internal market commissioner Thierry Breton that it had "taken action to remove or label tens of thousands of pieces of content" and removed hundreds of accounts linked to Hamas.

According to NewsGuard, "at least 14 false claims related to the war garnered 22 million views across X, TikTok, and Instagram within three days of the Hamas attack". On 13 October, the EU opened an investigation into X about the spread of disinformation and terrorist content related to the war.

On 14 October, Center for Countering Digital Hate CEO Imran Ahmed said his group was tracking a spike in efforts to push false information about the war, adding that U.S. adversaries, extremists, Internet trolls and engagement farmers were exploiting the war for their own gain. Graham Brookie, senior director of the Atlantic Council's Digital Forensic Research Lab, said that his team had witnessed a surge in terrorist propaganda, graphic content, false or misleading claims and hate speech, with much of the content being circulated on Telegram. Cyabra, an Israel-based company that analyses social media, said that one in five accounts taking part in conversations about Hamas' attacks were fake, adding that they had found approximately 40,000 such accounts on X and TikTok.

According to The New York Times, many images and videos circulating on social media as purported footage of the Israel–Hamas war are in fact from other conflicts, such as the Syrian civil war, or even from natural disasters, such as a recent flood in Tajikistan.

According to AP's David Klepper, "pictures from the Israel-Hamas war have vividly and painfully illustrated AI's potential as a propaganda tool, used to create lifelike images of carnage... digitally altered ones spread on social media have been used to make false claims about responsibility for casualties or to deceive people about atrocities that never happened."

X (formerly Twitter)

On 9 October, X said there were more than 50 million posts on the platform about the conflict. For updates about the war, Musk recommended two accounts that had previously promoted a false claim about an explosion near the Pentagon.

On 10 October, researchers found that a network of 67 X accounts was coordinating a campaign of pushing false information about the war.

According to Wired, X's community fact-checking system, Community Notes, has in some instances contributed to the spread of disinformation instead of correcting it. As an example of the system's unreliability, Wired cited an incident in which a video uploaded by Donald Trump Jr. showing Hamas shooting at Israelis was inaccurately tagged as false footage from several years earlier. Fake accounts impersonating a BBC journalist and The Jerusalem Post promoted false information about the war before X suspended them.

On 12 October, the Technology Transparency Project reported that Hamas was using premium accounts on X to push propaganda. X said it had banned Hamas and removed hundreds of affiliated accounts.

On 13 October, on The World radio program, Rebecca Rosman reported that disinformation on X was being monetized by paid verified users, whose posts receive preferential recommendation as new content, resulting in millions of views.

According to a report by NewsGuard on 19 October, verified users on X were behind 74% of the 250 most-engaged posts between 7 and 14 October that promoted false or unsubstantiated information about the war. NewsGuard also found that only 79 of the 250 posts were flagged by Community Notes.

On October 28, commentator Jackson Hinkle posted on X that Haaretz had reported that the Israeli government inflated the death toll for the 2023 Hamas attack on Israel. Haaretz stated that Hinkle's post "contain[ed] blatant lies" and was not substantiated by their reporting on the attack. Hinkle also claimed that the image of a Jewish baby burned alive by Hamas on October 7 "was created by artificial intelligence." He was subsequently deplatformed from YouTube.

Syrian YouTuber Maram Susli claimed that footage showed Israeli military helicopters firing on Israelis escaping the October 7 massacre carried out by Hamas at the Supernova festival. However, the footage turned out to be from Israeli attacks on Hamas positions in Gaza three days later. She also posted a photograph of a woman carrying a child's toy car down the stairs of a largely destroyed building, suggesting it showed Gaza after Israeli attacks; the picture was actually an award-winning photograph taken in Homs during the Syrian civil war.

An investigation by ProPublica and Columbia University's Tow Center for Digital Journalism found that verified accounts promoting misinformation about the conflict saw their audience grow significantly during the first month of the conflict, and that Community Notes had failed to scale sufficiently, with 80% of the debunked tweets reviewed not being clarified with a note.

TikTok

On 12 October, the EU warned TikTok about illegal content and disinformation on its platform. On 15 October, TikTok said it had taken action to remove "violative content and accounts". It also said it had established a command center for the conflict, updated its automated detection systems to detect violent content and added moderators who speak Arabic and Hebrew. A TikTok video promoting conspiracy theories that Hamas's attack had been orchestrated by the media was viewed over 300,000 times.

By mid-November 2023, Republican U.S. Representative Mike Gallagher had claimed that TikTok was "intentionally brainwashing" American youth into supporting Hamas, citing the spike in pro-Palestinian content following the outbreak of hostilities between Israel and Hamas. In response to criticism, TikTok issued a press release on 20 November asserting that younger Americans, particularly Millennials and Generation Z, tended to be more sympathetic to the Palestinians than to Israel, citing Gallup polling data dating back to 2010. TikTok also claimed that its algorithm did not take sides but operated in a positive feedback loop based on user engagement. The company also denied favouring "one side of an issue over another" or intentionally promoting pro-Palestinian hashtags such as "#freepalestine," which had attracted 25.5 billion views by November 14. By comparison, "#standwithisrael" had attracted 440.4 million views. TikTok's press release also stated that it had removed 925,000 videos related to the conflict for violating community standards, including promoting Hamas, had hired moderators fluent in Arabic and Hebrew to parse content, and begun removing fake accounts created in response to the Israel-Hamas conflict.

According to a TheMarker report, neo-Nazi propaganda, antisemitic content, and calls for the destruction of Israel were all circulating on TikTok throughout the war.

Telegram

The Al-Qassam Brigades, Hamas's military wing, had around 200,000 followers on Telegram at the time of Hamas's attack. According to the Digital Forensic Research Lab, its following has tripled since then, with its posts being viewed over 300,000 times. The Digital Forensic Research Lab found that Hamas relies on Telegram to send statements to its supporters.

According to political analyst and researcher Arieh Kovler, many Israelis follow official-sounding Telegram channels that share out-of-context videos and unverified rumors.

In a statement, Telegram said it was "evaluating the best approaches and... soliciting input from a wide range of third parties" and that it wished to be "careful not to exacerbate the already dire situation by any rush actions".

Impact

In November 2023, Center for Countering Digital Hate CEO Imran Ahmed said that misinformation about the war was as difficult to track as COVID-19 misinformation and misinformation about the 2020 United States presidential election.

In January 2024, McDonald's CEO Chris Kempczinski said, "Several markets in the Middle East and some outside the region are experiencing a meaningful business impact due to the war and associated misinformation that is affecting brands like McDonald's". The boycotts started after McDonald's Israel announced it had donated free meals to IDF soldiers involved in the war.

In February 2024, Bellingcat founder Eliot Higgins said, "I think the intensity of online discourse around Israel and Palestine is really kind of much worse than I've seen in any of the conflicts. People are not looking to establish the truth in many cases, but basically just look for things to bash each other over the head online. It's really just about people arguing their positions, their opinions, and not really establishing the exact truth around what's happening."

Speaking about Israel's decision not to allow foreign journalists into Gaza, UN secretary-general Antonio Guterres stated, "Denying international journalists entry into Gaza is allowing disinformation and false narratives to flourish." The technology director of the Institute for Strategic Dialogue stated, "The corrosion of the information landscape is undermining the ability of audiences to distinguish truth from falsehood on a terrible scale."

See also

  • Denial of the 7 October attacks
  • Pallywood
  • Media coverage of the Israel–Hamas war
  • Disinformation in the Russian invasion of Ukraine

External links

  • A flood of misinformation shapes views of Israel-Gaza conflict. Washington Post.
  • BBC expert on debunking Israel-Hamas war visuals: "The volume of misinformation on Twitter was beyond anything I've ever seen"

Text is available under the CC BY-SA license. Source: Misinformation in the Israel–Hamas war, Wikipedia (historical).