Monday, April 8, 2024

2 - The AI machine directing Israel’s bombing spree in Gaza

 

‘There was no “zero-error” policy’

B., a senior officer who used Lavender, told +972 and Local Call that in the current war, officers were not required to independently review the AI system’s assessments, in order to save time and enable the mass production of human targets without hindrance. 

“Everything was statistical, everything was neat — it was very dry,” B. said. He noted that this lack of supervision was permitted despite internal checks showing that Lavender’s calculations were considered accurate only 90 percent of the time; in other words, it was known in advance that 10 percent of the human targets slated for assassination were not members of the Hamas military wing at all.

For example, sources explained that the Lavender machine sometimes mistakenly flagged individuals who had communication patterns similar to known Hamas or PIJ operatives — including police and civil defense workers, militants’ relatives, residents who happened to have a name and nickname identical to that of an operative, and Gazans who used a device that once belonged to a Hamas operative. 

“How close does a person have to be to Hamas to be [considered by an AI machine to be] affiliated with the organization?” said one source critical of Lavender’s inaccuracy. “It’s a vague boundary. Is a person who doesn’t receive a salary from Hamas, but helps them with all sorts of things, a Hamas operative? Is someone who was in Hamas in the past, but is no longer there today, a Hamas operative? Each of these features — characteristics that a machine would flag as suspicious — is inaccurate.”

Palestinians at the site of an Israeli airstrike in Rafah, in the southern Gaza Strip, February 24, 2024. (Abed Rahim Khatib/Flash90)

Similar problems exist with the ability of target machines to assess the phone used by an individual marked for assassination. “In war, Palestinians change phones all the time,” said the source. “People lose contact with their families, give their phone to a friend or a wife, maybe lose it. There is no way to rely 100 percent on the automatic mechanism that determines which [phone] number belongs to whom.”

According to the sources, the army knew that the minimal human supervision in place would not discover these faults. “There was no ‘zero-error’ policy. Mistakes were treated statistically,” said a source who used Lavender. “Because of the scope and magnitude, the protocol was that even if you don’t know for sure that the machine is right, you know that statistically it’s fine. So you go for it.”

“It has proven itself,” said B., the senior source. “There’s something about the statistical approach that sets you to a certain norm and standard. There has been an illogical amount of [bombings] in this operation. This is unparalleled, in my memory. And I have much more trust in a statistical mechanism than a soldier who lost a friend two days ago. Everyone there, including me, lost people on October 7. The machine did it coldly. And that made it easier.”

Another intelligence source, who defended the reliance on the Lavender-generated kill lists of Palestinian suspects, argued that it was worth investing an intelligence officer’s time only to verify the information if the target was a senior commander in Hamas. “But when it comes to a junior militant, you don’t want to invest manpower and time in it,” he said. “In war, there is no time to incriminate every target. So you’re willing to take the margin of error of using artificial intelligence, risking collateral damage and civilians dying, and risking attacking by mistake, and to live with it.”

B. said that the reason for this automation was a constant push to generate more targets for assassination. “In a day without targets [whose feature rating was sufficient to authorize a strike], we attacked at a lower threshold. We were constantly being pressured: ‘Bring us more targets.’ They really shouted at us. We finished [killing] our targets very quickly.”

He explained that lowering Lavender’s rating threshold caused it to mark more people as targets for strikes. “At its peak, the system managed to generate 37,000 people as potential human targets,” said B. “But the numbers changed all the time, because it depends on where you set the bar of what a Hamas operative is. There were times when a Hamas operative was defined more broadly, and then the machine started bringing us all kinds of civil defense personnel, police officers, on whom it would be a shame to waste bombs. They help the Hamas government, but they don’t really endanger soldiers.”

Palestinians at the site of a building destroyed by an Israeli airstrike in Rafah, in the southern Gaza Strip, March 18, 2024. (Abed Rahim Khatib/Flash90)

One source who worked with the military data science team that trained Lavender said that data collected from employees of the Hamas-run Internal Security Ministry, whom he does not consider to be militants, was also fed into the machine. “I was bothered by the fact that when Lavender was trained, they used the term ‘Hamas operative’ loosely, and included people who were civil defense workers in the training dataset,” he said.

The source added that even if one believes these people deserve to be killed, training the system based on their communication profiles made Lavender more likely to select civilians by mistake when its algorithms were applied to the general population. “Since it’s an automatic system that isn’t operated manually by humans, the meaning of this decision is dramatic: it means you’re including many people with a civilian communication profile as potential targets.”

‘We only checked that the target was a man’

The Israeli military flatly rejects these claims. In a statement to +972 and Local Call, the IDF Spokesperson denied using artificial intelligence to incriminate targets, saying these are merely “auxiliary tools that assist officers in the process of incrimination.” The statement went on: “In any case, an independent examination by an [intelligence] analyst is required, which verifies that the identified targets are legitimate targets for attack, in accordance with the conditions set forth in IDF directives and international law.”  

However, sources said that the only human supervision protocol in place before bombing the houses of suspected “junior” militants marked by Lavender was to conduct a single check: ensuring that the AI-selected target is male rather than female. The assumption in the army was that if the target was a woman, the machine had likely made a mistake, because there are no women among the ranks of the military wings of Hamas and PIJ.

“A human being had to [verify the target] for just a few seconds,” B. said, explaining that this became the protocol after realizing the Lavender system was “getting it right” most of the time. “At first, we did checks to ensure that the machine didn’t get confused. But at some point we relied on the automatic system, and we only checked that [the target] was a man — that was enough. It doesn’t take a long time to tell if someone has a male or a female voice.” 

To conduct the male/female check, B. claimed that in the current war, “I would invest 20 seconds for each target at this stage, and do dozens of them every day. I had zero added value as a human, apart from being a stamp of approval. It saved a lot of time. If [the operative] came up in the automated mechanism, and I checked that he was a man, there would be permission to bomb him, subject to an examination of collateral damage.”

Palestinians emerge from the rubble of houses destroyed in Israeli airstrikes in the city of Rafah, southern Gaza Strip, November 20, 2023. (Abed Rahim Khatib/Flash90)

In practice, sources said this meant that for civilian men marked in error by Lavender, there was no supervising mechanism in place to detect the mistake. According to B., a common error occurred “if the [Hamas] target gave [his phone] to his son, his older brother, or just a random man. That person will be bombed in his house with his family. This happened often. These were most of the mistakes caused by Lavender.”

STEP 2: LINKING TARGETS TO FAMILY HOMES

‘Most of the people you killed were women and children’

The next stage in the Israeli army’s assassination procedure is identifying where to attack the targets that Lavender generates.

In a statement to +972 and Local Call, the IDF Spokesperson claimed in response to this article that “Hamas places its operatives and military assets in the heart of the civilian population, systematically uses the civilian population as human shields, and conducts fighting from within civilian structures, including sensitive sites such as hospitals, mosques, schools and UN facilities. The IDF is bound by and acts according to international law, directing its attacks only at military targets and military operatives.” 

The six sources we spoke to echoed this to some degree, saying that Hamas’ extensive tunnel system deliberately passes under hospitals and schools; that Hamas militants use ambulances to get around; and that countless military assets have been situated near civilian buildings. The sources argued that many Israeli strikes kill civilians as a result of these tactics by Hamas — a characterization that human rights groups warn evades Israel’s responsibility for inflicting the casualties. 

However, in contrast to the Israeli army’s official statements, the sources explained that a major reason for the unprecedented death toll from Israel’s current bombardment is the fact that the army has systematically attacked targets in their private homes, alongside their families — in part because it was easier from an intelligence standpoint to mark family houses using automated systems.

Indeed, several sources emphasized that, as opposed to numerous cases of Hamas operatives engaging in military activity from civilian areas, in the case of systematic assassination strikes, the army routinely made the active choice to bomb suspected militants when inside civilian households from which no military activity took place. This choice, they said, was a reflection of the way Israel’s system of mass surveillance in Gaza is designed.

Palestinians rush to bring the wounded, including many children, to Al-Shifa Hospital in Gaza City as Israeli forces continue pounding the Gaza Strip, October 11, 2023. (Mohammed Zaanoun/Activestills)

The sources told +972 and Local Call that since everyone in Gaza had a private house with which they could be associated, the army’s surveillance systems could easily and automatically “link” individuals to family houses. In order to identify the moment operatives enter their houses in real time, various additional automated software programs have been developed. These programs track thousands of individuals simultaneously, identify when they are at home, and send an automatic alert to the targeting officer, who then marks the house for bombing. One of several such tracking programs, revealed here for the first time, is called “Where’s Daddy?” 

“You put hundreds [of targets] into the system and wait to see who you can kill,” said one source with knowledge of the system. “It’s called broad hunting: you copy-paste from the lists that the target system produces.”

Evidence of this policy is also clear from the data: during the first month of the war, more than half of the fatalities — 6,120 people — belonged to 1,340 families, many of which were completely wiped out while inside their homes, according to UN figures. The proportion of entire families bombed in their houses in the current war is much higher than in the 2014 Israeli operation in Gaza (which was previously Israel’s deadliest war on the Strip), further suggesting the prominence of this policy.

Another source said that each time the pace of assassinations waned, more targets were added to systems like Where’s Daddy? to locate individuals that entered their homes and could therefore be bombed. He said that the decision of who to put into the tracking systems could be made by relatively low-ranking officers in the military hierarchy. 

“One day, totally of my own accord, I added something like 1,200 new targets to the [tracking] system, because the number of attacks [we were conducting] decreased,” the source said. “That made sense to me. In retrospect, it seems like a serious decision I made. And such decisions were not made at high levels.”

The sources said that in the first two weeks of the war, “several thousand” targets were initially inputted into locating programs like Where’s Daddy?. These included all the members of Hamas’ elite special forces unit the Nukhba, all of Hamas’ anti-tank operatives, and anyone who entered Israel on October 7. But before long, the kill list was drastically expanded. 

“In the end it was everyone [marked by Lavender],” one source explained. “Tens of thousands. This happened a few weeks later, when the [Israeli] brigades entered Gaza, and there were already fewer uninvolved people [i.e. civilians] in the northern areas.” According to this source, even some minors were marked by Lavender as targets for bombing. “Normally, operatives are over the age of 17, but that was not a condition.”

Wounded Palestinians are treated on the floor due to overcrowding at Al-Shifa Hospital, Gaza City, central Gaza Strip, October 18, 2023. (Mohammed Zaanoun/Activestills)

Lavender and systems like Where’s Daddy? were thus combined with deadly effect, killing entire families, sources said. By adding a name from the Lavender-generated lists to the Where’s Daddy? home tracking system, A. explained, the marked person would be placed under ongoing surveillance, and could be attacked as soon as they set foot in their home, collapsing the house on everyone inside.

“Let’s say you calculate [that there is one] Hamas [operative] plus 10 [civilians in the house],” A. said. “Usually, these 10 will be women and children. So absurdly, it turns out that most of the people you killed were women and children.”

