Rarely has the world witnessed such a radical transformation in the mechanics of killing as it did during Israel’s assault on the Gaza Strip (2023–2025). The Israeli military shifted from traditional intelligence-gathering methods to artificial intelligence algorithms capable of flagging thousands of Palestinians for assassination within seconds.
This was not merely a conventional military campaign reliant on overwhelming firepower. It marked a decisive turning point, integrating artificial intelligence and big data into the very core of the kill chain.
In this report, we examine the algorithmic systems deployed by Israeli occupation forces in Gaza and how they enabled an unprecedented expansion in the scale of killing.
1. “Lavender”: The Algorithm of Death
The “Lavender” system, developed by Israel’s elite intelligence Unit 8200, served as the beating heart of human targeting during the latest offensive.
The algorithm relies on machine learning: it is trained on data from “known individuals” and then scans Gaza’s population for similar patterns to identify targets.
Operating on probabilistic logic, the system assigns each person a score from 1 to 100 based on behavioral indicators such as frequently changing phones, relocating between addresses, or exhibiting specific communication patterns.
According to leaked information from within the Israeli military, the system identified approximately 37,000 Palestinians as legitimate assassination targets within the first weeks of the war. Many were low-level individuals who would not have appeared on traditional targeting lists.
This included thousands of civilians employed in government or civil police roles, as well as individuals whose behavioral patterns merely resembled those of resistance members. Testimonies and leaks from Israeli intelligence officers indicate a margin of error of up to 10 percent.
These results were reportedly converted directly into kill lists adopted by military leadership without meaningful review or additional human verification. All that remained was the press of a button, a fact that helps explain the widespread targeting of civil servants and their families.
2. “Where’s Daddy?”: The Eradication of Families
If “Lavender” determines who should be killed, “Where’s Daddy?” determines when and where, representing a chilling apex of technological brutality.
Under conventional military doctrine, combatants are targeted while engaged in military activity or located at military facilities. But Israeli forces faced difficulties pinpointing Hamas fighters inside underground tunnels. The intelligence solution was stark and disturbing: wait for them to return home.
The “Where’s Daddy?” system was designed to track the mobile phones of individuals identified by “Lavender.” Once a target entered their home, the system would send an immediate alert to the operations officer. The underlying premise was simple: the home is the easiest location to identify and destroy, regardless of whether family members are present.
Testimonies revealed lethal technical flaws in the system. Delays frequently occurred between the target’s arrival and the airstrike. In numerous cases, homes were bombed after the intended target had already left, resulting in the killing of entire families.
One officer reportedly admitted: “We bombed the house simply because the system said the target had arrived, without verifying whether he was still inside.”
For the first time, a stark “price list” for civilian casualties was revealed:
Low-level targets: The army permitted the killing of between 15 and 20 civilians as acceptable collateral damage for assassinating one individual.
High-ranking targets: The threshold was raised to allow the killing of hundreds of civilians in exchange for one senior figure.
The bombing of the Jabalia refugee camp to assassinate a single battalion commander, which left hundreds killed or wounded, stands as one example.
3. “The Gospel” (Habsora): The Destruction of Civilian Life
While “Lavender” and “Where’s Daddy?” focus on people, “The Gospel” (Habsora) targets structures. This AI system analyzes satellite imagery and structural data to generate building targets at industrial speed.
“Habsora” expanded the definition of a military objective to include so-called “power targets.” These were not weapons depots or command centers, but high-rise residential towers, universities, banks, and public buildings.
The declared purpose of striking such targets was to exert immense civilian pressure on Hamas by destroying Gaza’s middle class and civilian infrastructure.
A former intelligence officer described “The Gospel” as a “mass assassination factory.” The system is reportedly capable of generating 100 targets per day, compared to roughly 50 targets per year under previous systems.
This surplus fostered what officers described as “fire hunger”: bombardments conducted simply because targets were available, often without genuine strategic evaluation of military necessity.
Technological Complicity: Even More Terrifying Tools
The evolution of the killing machine did not end there. As the assault on Gaza continued into 2025, alongside ongoing Israeli operations in the West Bank, Unit 8200 introduced even more alarming technologies.
1. A Military “ChatGPT”
A March 2025 report by The Guardian revealed that Unit 8200 developed a large language model (LLM) similar to ChatGPT, trained on vast quantities of Palestinian textual and audio data.
The system is reportedly capable of:
Precisely understanding Palestinian dialects.
Analyzing conversational context to infer speakers’ intentions.
Answering queries such as: “Who spoke angrily about the army in the Hebron area last week?”
This represents a shift from identity-based targeting (as with Lavender) to intention-based targeting, in which individuals could be arrested or killed merely for expressing certain ideas, even before committing any act.
2. The Role of Cloud Infrastructure
To manage these enormous volumes of data, the Israeli military relied on cloud infrastructure provided by U.S. companies including Google and Amazon under Project Nimbus, and later Microsoft, which hosted extensive Unit 8200 data.
Leaked documents from 2025–2026 revealed that Microsoft provided an air-gapped cloud environment for the Israeli military.
This enabled the storage and processing of Palestinian surveillance data on civilian corporate servers, effectively making those companies technical partners in the infrastructure underpinning Israeli operations.
A Testing Laboratory and the Globalization of Crime
From within Gaza, these technologies appear as an extension of a longstanding policy that views every Palestinian as a potential threat. The deployment of “Lavender” and “Where’s Daddy?” does not reflect innocent innovation, but rather the weaponization of artificial intelligence to carry out systematic destruction.
A report by Action on Armed Violence (AOAV) warns that systematic home bombings aimed at killing individuals suspected of ties to resistance groups may amount to genocide.
United Nations experts have said these technologies are designed to “shock” the population and force displacement, suggesting objectives that extend beyond military operations toward the dismantling of Palestinian society.
The Office of the High Commissioner for Human Rights condemned the use of artificial intelligence in the Gaza offensive, stating that systems like “Lavender” and “Where’s Daddy?” contributed to unprecedented destruction of homes and infrastructure and to the killing of thousands of Palestinians.
UN reports have described what is unfolding as “domicide” and “technocide,” noting that more than 80 percent of Gaza’s buildings have been destroyed following recommendations generated by “The Gospel.”
For many Palestinians, the assault on Gaza has become a testing ground for killing technologies later exported to authoritarian regimes worldwide.
Australian journalist Antony Loewenstein warned that Israel markets these systems to governments that praise rather than criticize it. Palestinian blood, he argues, becomes a commodity and suffering a profitable technological product.
Ultimately, these patterns expose the darkest face of artificial intelligence, in which algorithms become instruments of mass killing, heralding a new era of “algorithmic warfare” where human beings are stripped of their humanity and reduced to data points in a system designed to harvest lives.