AI in the hands of humans without humanity

Oliver Thansan
06 April 2024 Saturday 04:26

An investigation by the Israeli magazine 972, in collaboration with the news website Local Call and based on the testimony of six intelligence officers responsible for target selection in the current war in Gaza, has revealed the Israeli army's use of Lavender, software based on artificial intelligence (AI). Lavender is used to identify members of Hamas and Islamic Jihad and place them on a list of potential bombing targets, including low-profile individuals.

These same sources confirm to 972 that Lavender was of primary importance in the initial stages of the war. Its influence on military operations reached such a level that the machine's output was treated as if it were “a human decision”: the army gave officers broad approval to adopt Lavender's target lists without any requirement to check thoroughly why the machine had selected those targets or to examine the raw intelligence data on which the selections were based. In fact, according to one of the sources, the only check a human operator made was to ensure that the selected target was a man, and he had a maximum of 20 seconds to do so before authorizing the attack. This “routine check” was deemed sufficient because, according to Lavender's developers, the software has a nominal success rate of 90%, which means that one in ten times it identifies targets that have nothing to do with either Hamas or Islamic Jihad. To add more horror to horror, sources confirm that another variable in attacking a target is cost/benefit: a high-profile member of Hamas or Islamic Jihad is attacked with a guided missile (also called “smart”), which supposedly causes less collateral damage, whereas a low-profile target is attacked with an unguided missile, a “dumb” bomb in military jargon, which, being much less precise, causes more collateral damage. “You should not waste expensive bombs on unimportant people; it is very expensive for the country and there is a shortage [of these bombs],” one of the intelligence officers told 972.

Collateral damage takes into account the estimated number of civilian casualties permitted for each selected target. This threshold has varied depending on the stage of the war, at times allowing up to 100 civilian victims for each high-command target and up to 20 in the case of rank-and-file (that is, low-profile) targets. In this regard, an anonymous Israeli intelligence officer states: “We were not interested in killing targets only when they were in a military building or engaged in military activity. On the contrary, the Israel Defense Forces bombed the houses where they lived without hesitation, as a first option, and preferably at night, because they were more likely to be at home. It is much easier to bomb houses, and the system is built to look for them in these situations.” To follow targets to their homes, automated systems are also used; one of them bears the terrible name Where's Daddy?

We cannot say that we were not warned. In 2016, AI researchers, Nobel laureates and other prominent members of the scientific community signed a letter warning of the risks of using AI in the military field. One of the main risks is the rate of false positives, that is, the selection of innocent civilian targets. During the first weeks of the war, the military relied almost entirely on Lavender, and the software identified up to 37,000 Palestinians as targets for possible attacks. If Lavender's 10% error rate is correct, we can estimate that it included about 4,000 civilians on the target list. Unfortunately, what we are now seeing shows that the signatories' concern was legitimate: the 972 investigation shows that the worst omens have come true.

In 2021, an author with the enigmatic name of Brigadier General Y. S. had already described, without naming it, a system similar to Lavender in his book The Human-Machine Team: How to Create Synergy Between Human and Artificial Intelligence That Will Revolutionize Our World. Although it may sound very generic, the book discusses how applying AI to war can solve the problems of target selection and the bottleneck that its approval represents. 972's investigation has also revealed that Brigadier General Y. S. is the current commander of Unit 8200, the Israeli army's elite intelligence unit. In his book he writes: “A machine can use big data to generate information better than humans. However, a machine cannot understand context, has no feelings or ethics, and cannot think outside the box. Therefore, instead of prioritizing between humans and machines, we would have to create the human-machine team, which will combine human intelligence and artificial intelligence and create supercognition.” The problem in this case is that, in Israel, the team of people who decided to design, program and deploy Lavender is made up of humans without humanity.