Israel Used AI 'Lavender' to Target Thousands With Little Human Verification, Report Says
Summary from the AllSides News Team
Israel has been using an AI-based program called “Lavender” to pick targets for its bombings in Gaza, causing many civilian casualties, according to reports by Israeli outlet +972 Magazine (Not Rated) and The Guardian (Lean Left bias).
The Details: Citing “six Israeli intelligence officers” who had “first-hand involvement with the use of AI to generate targets for assassination,” +972 Magazine reported that Lavender “played a central role” in the war in Gaza from its early stages, identifying as many as 37,000 Palestinians as militants. There was reportedly “no requirement to thoroughly check why the machine made those choices.” Because it was “easier to locate the individuals in their private houses,” targets were often struck “in their homes — usually at night while their whole families were present.” A previous +972 Magazine report from November cited one Israeli intelligence officer who said Israel had used a different AI, dubbed “The Gospel,” to facilitate a “mass assassination factory.”
Key Quote: The Israel Defense Forces disputed the reports, saying it “directs its strikes only towards military targets and military operatives,” adding, “Exceptional incidents undergo thorough examinations and investigations.”
How the Media Covered It: Coverage was most common in left-rated outlets; those that had been sympathetic to Palestinians tended to frame Israel particularly negatively, with headlines like, “‘AI-assisted genocide’: Israel reportedly used database for Gaza kill lists.” The Washington Times (Lean Right bias) and the Daily Mail (Right bias) were the only right-rated outlets found by AllSides to have published original coverage of the story.
Featured Coverage of this Story
From the Center
Israel’s reported use of AI in its Gaza war may explain thousands of civilian deaths
For years, experts have warned about the dangers of using AI in warfare. Much of the coverage of their warnings has focused on the nightmare scenario of Terminator-style autonomous weapons, but that’s not the only dystopian possibility in the technology’s battlefield deployment, as Israel has reportedly demonstrated in its war against Hamas in Gaza.
The Israeli sister publications +972 and Local Call yesterday published a lengthy account of an AI-based system called Lavender, which the Israeli Defense Forces have been using to identify targets for assassination (+972 also shared the accounts with the Guardian). Historically,...
From the Left
‘The machine did it coldly’: Israel used AI to identify 37,000 Hamas targets
The Israeli military’s bombing campaign in Gaza used a previously undisclosed AI-powered database that at one stage identified 37,000 potential targets based on their apparent links to Hamas, according to intelligence sources involved in the war.
In addition to talking about their use of the AI system, called Lavender, the intelligence sources claim that Israeli military officials permitted large numbers of Palestinian civilians to be killed, particularly during the early weeks and months of the conflict.
Their unusually candid testimony provides a rare glimpse into the first-hand experiences of Israeli intelligence officials who...
From the Right
Israel disputes it has Lavender, AI program for targeted killing that tolerates civilian casualties
Israel is aggressively disputing assertions that it is using an artificial intelligence system for a targeted killing program that tolerates civilian deaths as acceptable collateral damage in its war against Hamas.
Explosive allegations that Israel has a secret AI-powered killing machine called “Lavender” spread on Wednesday in a pair of news reports citing anonymous intelligence sources involved in the Hamas-Israel war.
The Israel Defense Forces, already under fire for the humanitarian conditions in Gaza, said Wednesday evening that it does not use AI to designate people as targets for military strikes. Israel‘s relations with...