Much like Daniel Hale, the recipient of our inaugural Ellsberg Whistleblower Award, anticipated, the development and deployment of (semi-)automated and autonomous weapon systems have increased considerably since his whistleblowing in 2014. Technological advances in artificial intelligence have drastically changed warfare. In April 2024, the well-known Israeli journalist Yuval Abraham and The Guardian published investigations into the use of an artificial intelligence called ‘Lavender’ to identify potential Hamas terrorists as targets for bombing. The testimonies of six whistleblowers from the Israel Defense Forces (IDF) suggest that the selections generated by Lavender were subject to alarmingly little human control and thus went unverified, and that high numbers of “collateral” victims were knowingly tolerated.
Continue reading the statement of Ninon Colneric, former judge at the European Court of Justice (ECJ).
Annegret Falter
Chair of the Whistleblower Network
Statement of Ninon Colneric on Lavender and the Use of AI in the Gaza War
October 17th, 2024
The country code for Israel is +972. The English-language +972 Magazine, named after this country code, published an article by Yuval Abraham on April 3rd, 2024, that spread like wildfire around the world. Within 48 hours it was picked up by The Guardian, The Washington Post, Le Monde, Haaretz, CNN, Business Insider and The Independent, among others. The BBC, CNN, Channel 4 and Democracy Now interviewed the author.[1] With some delay, even the Hamburger Abendblatt devoted a major article to Yuval Abraham’s report.[2]
What was it about? Yuval Abraham himself summarized his report as follows:
“Our new investigation reveals that the Israeli army has developed an artificial intelligence-based program called ‘Lavender,’ which has been used to mark tens of thousands of Palestinians as suspected militants for potential assassination during the current Gaza war.
“According to six whistleblowers interviewed for the article, Lavender has played a central role in the Israeli army’s unprecedented bombing of Palestinians since October 7, especially during the first weeks of the war. In fact, according to the sources, the army gave sweeping approval for soldiers to adopt Lavender’s kill lists with little oversight, and to treat the outputs of the AI machine ‘as if it were a human decision.’ While the machine was designed to mark ‘low level’ military operatives, it was known to make what were considered identification ‘errors’ in roughly 10 percent of cases. This was accompanied by a systematic preference to strike Lavender-marked targets while they were in their family homes, along with an extremely permissive policy toward casualties, which led to the killings of entire Palestinian families.” [3]
The article appeared under the headline “‘Lavender’: The AI machine directing Israel’s bombing spree in Gaza”.[4] Parallel to the English-language publication, a Hebrew version appeared on the news website Local Call. On November 30th, 2023, Yuval Abraham had already published a report on the AI system Habsora (“Gospel”), which marks buildings and facilities as targets at high speed and was also used in Gaza against non-military targets.[5]
In their response to the statements made by the six whistleblowers with regard to Lavender, the Israel Defense Forces stated: “The ‘system’ your questions refer to is not a system, but simply a database whose purpose is to cross-reference intelligence sources, in order to produce up-to-date layers of information on the military operatives of terrorist organizations. This is not a list of confirmed military operatives eligible to attack.” [6]
The following can be said about this:
In 2021, a book was published with the title “The Human-Machine Team – How to Create Synergy Between Human & Artificial Intelligence That Will Revolutionize Our World”. The author is given as “Brigadier General Y.S.”. He is “Yoav”, the commander of the Center for Artificial Intelligence of Unit 8200, the elite unit of the intelligence service of the Israeli armed forces. In his book, the commander complains that “humans are the bottleneck that prevent the creation of tens of thousands of targets in context. […] It does not matter how many people you have tasked to produce targets during the war – you still cannot produce enough targets per day.” [7] He sees the solution in a “targets machine” based on artificial intelligence and machine-learning algorithms. [8] He describes in detail a system similar to Lavender. Yoav wrote the book during a one-year study visit to the National Defense University in Washington. [9] In Iraq and Afghanistan, the US had already used a database system that drew on numbers from captured cell phones to generate lists of targets for nighttime attacks and airstrikes; the use of human intelligence to confirm identities was explicitly ruled out. [10] A publication by the Israeli military reveals that artificial intelligence was already being used in Israel in 2021, as part of Operation “Guardian of the Walls”, to mark target persons en masse. [11]
The author of the Lavender article, Yuval Abraham, is no stranger to Berlin. Together with the Palestinian Basel Adra, he received the award for best documentary film at the Berlinale 2024 for “No Other Land”. In his acceptance speech, he described the unequal legal treatment of Israeli citizens and Palestinians in the territories occupied by Israel as “apartheid”. For this he was accused of anti-Semitism by German politicians and media. On X, Yuval Abraham reported that a right-wing Israeli mob looking for him then descended on his family, who had to flee in the middle of the night; he himself received death threats. In response to the accusations of anti-Semitism, he wrote: “The appalling misuse of this word by Germans, not only to silence Palestinian critics of Israel, but also to silence Israelis like me who support a ceasefire that will end the killing in Gaza and allow the release of the Israeli hostages – empties the word anti-Semitism of meaning and thus endangers Jews all over the world.” [12] These events are presumably the reason why the Whistleblower Network was unable to reach Yuval Abraham to invite him to an event in Berlin.
The six whistleblowers on whose statements the Lavender article is based have remained anonymous, for good reason. The case of Mordechai Vanunu shows how dangerous it can be to make Israeli military secrets public. In 1986, Vanunu revealed the extent of Israel’s nuclear weapons program. While in London, he was lured to Rome by a secret agent, drugged there, taken to Israel on board an Israeli freighter, tried in camera and sentenced to 18 years in prison. He spent 10 years in solitary confinement in a cell measuring 3 by 2 meters. [13] A traitor from the Israeli point of view, Vanunu was awarded the Right Livelihood Award, the so-called Alternative Nobel Prize, in 1987. [14]
What do we know about the six whistleblowers who were Abraham’s sources? According to Abraham, they were Israeli intelligence officers who had all served in the army during the current war in Gaza and had been directly involved in the use of AI to generate targets. One of them, who held a high rank, stated that in retrospect he believed the disproportionate policy of killing Palestinians in Gaza endangered Israelis as well; this was one of the reasons he had decided to be interviewed. Another had felt remorse when it turned out that the person targeted in the bombing of a house had already moved elsewhere with their family at the time of the bombing, and that two other families with their children were in the house. In an interview with Democracy Now, Abraham explained that his informants had still been under the impression of the horrors of October 7 when they were called up. Gradually, however, some of them realized that they too were expected to commit atrocities. [15]
According to the Israeli army, its procedural regulations require that each target be individually verified as meeting the relevant definitions of international law and the additional restrictions laid down in the army’s directives. An individual assessment of the anticipated military advantage and the expected collateral damage must also be carried out. According to the whistleblowers’ detailed statements, however, the practice after October 7, especially in the first weeks, was as follows:
Unlike in previous attacks on Gaza, the army decided to target all members of Hamas’ military branch, regardless of their rank or military importance. Four sources said that some 37,000 Palestinians had been marked for killing as suspected Hamas militants, mostly low-ranking individuals. Intelligence officials tested Lavender’s targeting accuracy for about two weeks and concluded that the system’s identifications were accurate in 90% of cases. As a result, the army allowed Lavender’s identification of individuals to be treated essentially like an order. One source explained that before a bombing was authorized, only about 20 seconds were spent per target, essentially to confirm that the target was a man.
The target persons were systematically attacked when they were at home with their families. Special software programs were used to determine that exact point in time; one of them was called “Where’s Daddy?”. Mostly so-called dumb bombs, which are cheaper than precision bombs, were used, with the result that entire houses collapsed on top of their occupants. There were guidelines for the number of civilian casualties permitted per fighter. For low-ranking fighters this was initially up to 20 or, according to another source, up to 15 people; the number was later lowered and then raised again. For high-ranking fighters, over 100 civilian casualties were considered acceptable. Whereas in previous Gaza wars it was checked after an attack whether the target had been killed and how many civilians had died with him, in the current war this routine investigation was abandoned for low-ranking fighters.
If I look at the statements of the six whistleblowers as if I had to make a judicial assessment of the evidence, I see a number of indications of their credibility, for example: the quantitative richness of detail, the descriptions of the witnesses’ own psychological processes, the unusual program name “Where’s Daddy?”, and the manifold interweaving of the content of the statements with changing external circumstances. The credibility of the statements is also supported by the outcome of the attacks on Gaza in the first weeks after October 7: unprecedented destruction and extremely high numbers of victims within a very short period of time.
Last but not least, the almost completely unrestrained warfare described by the whistleblowers fits the social context of this war. The Israeli army had been humiliated by the events of October 7 and will have been anxious to restore its reputation very quickly through images of victory. The language used by senior representatives of the Israeli government in relation to the Palestinians was dehumanizing. The Israeli defense minister, for example, declared in a speech to Israeli forces on the Gaza border on October 10: “I have released all restraints. […] We are fighting human animals.” [16] Untrue atrocity stories, such as the one about the 40 beheaded babies,[17] further inflamed a mood already agitated by the horrors of October 7. Against this background, the statements of the six whistleblowers appear plausible.[18]
Yuval Abraham linked his report on Lavender to the hope that what it set out so clearly would help people all over the world to demand a ceasefire, the release of the hostages, an end to the occupation and a political solution. But he also looked beyond Gaza. As he explained in his interview with Democracy Now,[19] he sees the use of artificial intelligence in warfare as a danger to humanity. In his view, this type of warfare makes it possible to evade accountability. International law is in crisis, and the use of artificial intelligence in war will deepen that crisis.
So it is.
[1] https://www.972mag.com/edition/yuval-lavender-newsletter/
[2] Hamburger Abendblatt, April 26th, 2024, p. 4.
[3] https://www.972mag.com/edition/yuval-lavender-newsletter/
[4] https://www.972mag.com/lavender-ai-israeli-army-gaza/
[5] https://www.972mag.com/mass-assassination-factory-israel-calculated-bombing-gaza/
[6] https://www.theguardian.com/world/2024/apr/03/israel-defence-forces-response-to-claims-about-use-of-lavender-ai-database-in-gaza
[7] The Human-Machine Team, p. 51 f.
[8] ibid., pp. 70-72.
[9] ibid., p. 22.
[10] Medea Benjamin and Nicolas J.S. Davies: A Brief History of Kill Lists, From Langley to Lavender, https://www.codepink.org/aiisrael
[11] https://www.israeldefense.co.il/en/node/57246
[12] https://x.com/yuval_abraham/status/1762558886207209838, post of February 27th, 2024.
[13] https://rightlivelihood.org/the-change-makers/find-a-laureate/mordechai-vanunu/, https://www.faz.net/aktuell/feuilleton/israel-die-angst-vor-vanunu-1161556.html
[14] https://rightlivelihood.org/the-change-makers/find-a-laureate/mordechai-vanunu/
[15] https://www.youtube.com/watch?v=4RmNJH4UN3s&list=PLHz5QS2YdKXKD5Q0gEGvnFYSlv98-bpwy&index=66
[16] Order of the International Court of Justice of January 26th, 2024 in the case South Africa v. Israel, para. 52.
[17] https://www.lemonde.fr/en/les-decodeurs/article/2024/04/03/40-beheaded-babies-the-itinerary-of-a-rumor-at-the-heart-of-the-information-battle-between-israel-and-hamas_6667274_8.html
[18] See also the assessment of military expert Andreas Krieg, https://www.youtube.com/watch?v=ydiWRyDys0U, from minute 7:20.
[19] https://www.youtube.com/watch?v=4RmNJH4UN3s&list=PLHz5QS2YdKXKD5Q0gEGvnFYSlv98-bpwy&index=66