EU research funding connected to Israeli AI in Gaza targeting civilians

Palestinians inspect the rubble of buildings destroyed in an Israeli air strike on the Kamal Adwan Hospital in Beit Lahia in the Gaza Strip on December 29, 2024. (AA Photo)
By Anadolu Agency
Jan 8, 2025 5:09 AM

The European Union’s funding of artificial intelligence (AI) research in Israel has come under fire after reports suggest that EU-backed technologies have been used in the targeting of civilians during Israel’s ongoing military operations in Gaza.

Since the start of the Israeli attacks on Gaza on Oct. 7, 2023, the EU has provided over €238 million ($246 million) to Israeli institutions for research and innovation. This funding, critics claim, has supported the development of AI-driven systems used to identify, locate, and kill targets in Gaza.

Ethical concerns and lack of oversight

Nozomi Takahashi, a board member of the European Coordination of Committees and Associations for Palestine, said the allegations that EU-funded AI technologies have been used in the killing of civilians are well documented.

Takahashi pointed to several AI systems, including “Habsora” (The Gospel), “Lavender,” and “Where is Daddy?” as being central to these operations. She emphasized that these systems have been used indiscriminately, targeting civilians without adequate oversight and in violation of international law.

People and first responders inspect the rubble of a collapsed residential building that was hit by Israeli bombardment in the Saraya area in al-Rimal in central Gaza City on January 4, 2025. (AFP Photo)

“These systems are being used to locate and kill targets in the current genocide in Gaza,” Takahashi told Anadolu. “The scale and frequency of civilians killed using such AI systems are devastating.”

Critics argue that the EU’s ethical oversight of these funding programs is inadequate. Takahashi raised concerns that the EU’s Horizon Europe program, which supports civilian projects, has inadvertently funded military applications.

She noted the difficulty of tracing specific EU-funded projects linked to the AI technologies deployed by Israel’s military, citing issues of confidentiality.

Palestinian children light a fire to shield themselves from the cold in Deir al-Balah, Gaza, on December 27, 2024. Palestinians who fled their homes due to Israeli attacks are struggling to provide water and food for their children while enduring cold weather in makeshift tents. (AA Photo)

Takahashi also criticized the EU’s failure to adequately monitor the potential military use of its funded projects. Horizon Europe’s ethical principles call for respect for human dignity, freedom, and human rights, but Takahashi contends that the EU does not scrutinize the history of military involvement or human rights violations when reviewing project applications.

EU’s complicity in supporting military industry

Eman Abboud, a lecturer at Trinity College Dublin, echoed these concerns, stating that the EU is “culpable” for funding arms companies and security research programs that contribute to Israel’s military actions. Abboud pointed to companies like Elbit Systems and Israel Aerospace Industries, which have received EU funding under the guise of civil security research.

“Israeli organizations that profit from violent oppression and apartheid in Palestine have received funding from European programs, despite Israel facing genocide charges at the International Court of Justice,” Abboud said. “The EU has refused to sever its trade links with Israel or ban Israeli entities from Horizon Europe.”

EU-funded Israeli AI technologies deployed in civilian targeting

Israeli AI technologies, including Habsora and Lavender, have drawn specific scrutiny due to their reliance on ambiguous data and their role in targeting civilians. Lavender, which assesses the likelihood of an individual’s ties to Hamas, has flagged tens of thousands of Palestinians as “suspects,” without clear criteria or oversight. Habsora, which generates automated target lists, has been responsible for strikes on civilian infrastructure, with civilian casualties often predicted in advance.

Another AI system, “Where is Daddy?” has been used to track individuals in Gaza so that their homes can be bombed, with no regard for the presence of family members. Critics argue that these technologies frequently make errors, disregard proportionality, and contribute to widespread civilian deaths. Since the conflict began, over 45,850 Palestinians have been killed.

Takahashi and Abboud both stress the need for stronger ethical evaluations and more stringent oversight of EU-funded projects, particularly in light of their potential military applications. They warn that the use of AI in warfare, especially by a government with a documented history of human rights violations, poses significant risks to civilian lives.
