The Israeli army is being accused of deliberately targeting civilian homes in the weeks after Hamas' terror attack on Oct. 7, 2023, and of using an AI-based program named "Lavender" to generate targets for assassination, resulting in a large number of bombings based on decisions made with little to no human review.
The allegations emerged in a new investigative report by +972 and Local Call that found an Israel Defense Forces system applied mass surveillance in Gaza to compile a list of 37,000 potential bombing targets.
That list included a significant number of low-level individuals allegedly associated with Hamas — whose terrorists killed over a thousand Israelis on Oct. 7 and took scores of hostages, more than 100 of whom are believed to still be held in Gaza — even though such individuals would not typically be the primary focus of bombing operations.
"We were not interested in killing [Hamas] operatives only when they were in a military building or engaged in a military activity," A., an intelligence officer, told +972 and Local Call.
"On the contrary, the IDF bombed them in homes without hesitation, as a first option. It's much easier to bomb a family's home. The system is built to look for them in these situations."
Israeli Military's AI Use in Gaza War
The reporting is based on interviews with six Israeli intelligence officials who were deeply involved in the use of AI to select targets. Those officials painted a potentially troubling picture of AI's role in the war.
According to the officers, the Lavender AI program, developed by Israel's elite intelligence division, Unit 8200, allegedly played a pivotal role in identifying potential targets, often referred to as "junior" operatives.
The sources told +972 and Local Call that, during the first weeks of the war, the IDF relied almost completely on Lavender, which flagged as many as 37,000 Palestinians as suspected militants linked to Hamas or Palestinian Islamic Jihad, marking them and their homes for possible air strikes.
Additionally, the officers revealed pre-authorized allowances for civilian losses, with some indicating that up to 15 or 20 people were considered acceptable collateral damage in attacks targeting low-ranking terrorists.
These allowances were reportedly sanctioned by the government. According to the report, the strikes were often carried out with unguided munitions, destroying entire homes and killing the civilians inside.
The IDF said "analysts must conduct independent examinations, in which they verify that the identified targets meet the relevant definitions in accordance with international law and additional restrictions stipulated in the IDF directives."
But one official told +972 that human personnel often served only as a "rubber stamp" for the machine's decisions, typically devoting only about 20 seconds to each target, checking only that the target was male, before authorizing a bombing.
Israel-Hamas War
One officer compared the role humans played in the targeting process to a simple "rubber stamp" on Lavender's judgments, with virtually no substantive human review of the system's recommendations.
Another commander expressed a preference for AI-driven targeting over human judgment, citing the emotional toll the war had taken on soldiers who had lost comrades in the fighting.
In response to the allegations, the IDF confirmed that it uses "information management tools" in its target identification process but denied that it uses artificial intelligence to identify terrorists.
The IDF has said that its analysts carry out independent assessments to ensure that targets satisfy both legal and operational requirements before a strike is authorized. The military also said it places a high priority on complying with international law and maintaining proportionality when evaluating the potential impact of attacks on civilian populations, The Sun reported.