Israel Used AI to Kill Gazan Families
By Richard C. Gross
“It is my conviction that killing under the cloak of war is nothing but an act of murder.”
Albert Einstein (1879-1955), theoretical physicist
Israel systematically, and often without human intervention, deployed artificial intelligence to find and kill Hamas terrorists in Gaza, disregarding previous standard military procedures and bombing its targets, usually at night, regardless of whether they were in their homes with women and children present, Israeli magazines reported.
The decision to disregard civilian lives when attacking terrorists, presumably made at a very senior military level, must have surprised the soldiers operating the AI targeting systems. So far as is known, it is an Israeli first: before this highly unusual and lengthy war, strict procedures existed to avoid killing civilians.
If the AI machine, known as Lavender, identified a senior Hamas official such as a battalion or brigade commander, “the army on several occasions authorized the killing of more than 100 civilians in the assassination of a single commander,” the publications +972 Magazine and Local Call said Sunday. They quoted six sources in their investigation.
It’s no wonder, then, about the enormous number of Palestinians killed and the widespread rubble from flattened homes in the six months since a Hamas shock assault killed 1,200 Israelis and seized 240 hostages Oct. 7. The Gaza Health Ministry, which is run by Hamas, has reported up to 33,000 Palestinian deaths since the Israeli retaliation, called “Operation Iron Swords.”
Israeli journalist and filmmaker Yuval Abraham wrote the 48-page, double-spaced report for the English-language +972 Magazine and Local Call, a sister online Hebrew-language publication. Both lean left.
The horrific application to warfare of AI, a largely experimental computer tool that experts have branded dangerous, may be a first. Generative AI creates new content, including text, images, sounds and animation. Israel ranks third in the world in its development.
The ramifications of the terror that this investigation of AI use unveiled and the enormity of Palestinian casualties because of AI surveillance may be overwhelming. They could force Washington to halt all weapons shipments to Israel, even those that had been ordered long ago, leading to anger and a worsening rupture in relations. It is not clear whether U.S. military officials were told of these dystopian AI tactics. Not even “Dune” dreamed this up.
Lavender was developed “to create human targets in the current war,” +972 said, and “has marked some 37,000 Palestinians as suspected ‘Hamas militants,’ most of them junior, for assassination.” An army spokesperson denied the existence of such a “kill list,” it said.
By combining Lavender with another AI system for home tracking, chillingly called “Where’s Daddy,” militants marked for killing “could be attacked as soon as they set foot in their home, collapsing the house on everyone inside” [with a bomb], the report said.
“Let’s say,” source A said, “you calculate [that there is one] Hamas [operative] plus 10 [civilians in the house]. Usually, these 10 will be women and children. So absurdly, it turns out that most of the people you killed were women and children.”
Far worse than just absurd. Indefensible.
The army preferred to use dumb bombs that can destroy buildings and cause major casualties when targeting low-ranking militants, and not precision weapons, because “you don’t want to waste expensive bombs on unimportant people – it’s very expensive for the country and there’s a shortage [of these bombs],” an intelligence officer told the magazines.
“Unimportant people?”
It was permissible, the army decided during the first weeks of the war, “to kill up to 15 to 20 civilians,” two of the sources said. Previously, “the military did not authorize any ‘collateral damage’ during assassinations of low-ranking militants.”
“There was a completely permissive policy regarding the casualties of [bombing] operations – so permissive that in my opinion it had an element of revenge,” D, an intelligence source, said. “The core of this was the assassinations of senior [Hamas and Palestine Islamic Jihad (PIJ)] commanders for whom they were willing to kill hundreds of civilians.”
“They” were not identified. Senior military officials, presumably. There was no indication whether Prime Minister Benjamin Netanyahu or others in his government knew of these tactics. They know now.
“. . . The army routinely made the active choice to bomb suspected militants when inside civilian households from which no military activity took place,” the report said. “This choice, they said, was a reflection of the way Israel’s system of mass surveillance in Gaza is designed.”
The army’s International Law Department never before gave such “sweeping approval” to kill noncombatants, according to source A, who was an officer in a target operation room, the report said. “It’s not just that you can kill any person who is a Hamas soldier, which is clearly permitted and legitimate in terms of international law. But they directly tell you: ‘You are allowed to kill them along with many civilians.’
“Every person who wore a Hamas uniform in the past year or two could be bombed with 20 [civilians killed as] collateral damage, even without special permission. In practice, the principle of proportionality did not exist.” Proportionality in military terms refers to civilian suffering in wartime.
“There were regulations, but they were just very lenient,” the report quoted E, another intelligence source, as saying. “We’ve killed people with collateral damage in the high double digits, if not low triple digits. These are things that haven’t happened before.”
“It was like that with all the junior targets,” said C, whom the exposé said used automated programs during the war. “We usually carried out the attacks with dumb bombs, and that meant literally destroying the whole house on top of its occupants. But even if the attack is averted, you don’t care – you immediately move on to the next target.
“Because of the system, the targets never end. You have another 36,000 waiting.”
The collateral damage from bombing a high-value target can be shocking.
For example, the report quoted another source, B, as saying that when a plane bombed the commander of the Shuja’iya Battalion on Dec. 2 (whom the IDF identified as Wisam Ferhat), “we knew that we would kill over 100 civilians. For me, psychologically, it was unusual. Over 100 civilians – it crosses some red line.”
It sure does.
The report quoted a senior intelligence source as saying he believes the “disproportionate” policy of killing Palestinians in Gaza “endangers Israelis.”
“In the short term, we are safer, because we hurt Hamas,” he said. “But I think we’re less secure in the long run. I see how all the bereaved families in Gaza – which is nearly everyone – will raise the motivation for [people to join] Hamas 10 years down the line. And it will be much easier for [Hamas] to recruit them.”
It sure will. Better to bury those AI systems somewhere, especially before others go running to Israel to learn how to use them.
Richard C. Gross, who covered war and peace in Israel and the American military at the Pentagon, was foreign editor of United Press International and opinion page editor of The Baltimore Sun.