
Draft:Lavender (artificial intelligence)


Lavender is a human-targeting artificial intelligence system developed by Unit 8200, the Israel Defense Forces' (IDF) elite intelligence division. A system of this kind was first hypothesized in the 2021 book "The Human-Machine Team: How to Create Synergy Between Human and Artificial Intelligence That Will Revolutionize Our World"; Lavender is the first known implementation to actively employ artificial intelligence in identifying and approving military bombing targets.[1][2]

First deployed during the 2021 Israel–Palestine crisis, Lavender was subsequently relied on by the IDF at least during the first few weeks of the Israel–Hamas war. In that period it reportedly marked as many as 37,000 suspected militants in Gaza as potential bombing targets, a figure comparable to the Gaza health ministry's estimate that 33,000 Palestinians had been killed in the six months since the conflict began.

While Lavender's known role is to broadly identify human targets, in the manner of a hit list, it works in conjunction with two other AI systems that play active roles in target selection: Where's Daddy, which tracks identified targets in order to signal an airstrike once they enter their homes, and The Gospel, which marks militant-occupied buildings rather than the militants themselves.[1]

Operation


Lavender operates as a conventional machine-learning classifier. Trained on data about known Hamas and Palestinian Islamic Jihad (PIJ) operatives, it estimates the likelihood that a given individual is an operative and produces a ranking of the population by similarity to the training data. This ranking is passed to intelligence officers for human evaluation. Approved targets are then tracked by Where's Daddy, and an airstrike is called once the target enters their home.
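Lavender's architecture has not been made public. As a point of reference, the sketch below shows the generic technique the sources describe, a binary classifier whose output probabilities are used to rank individuals for review, implemented here with scikit-learn on synthetic data. Every feature, label, and threshold is invented for illustration; nothing here reflects the actual system's design.

```python
# Illustrative sketch of a generic likelihood-ranking classifier.
# All data, features, and thresholds are synthetic and hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic training set: feature vectors for individuals previously
# labeled as operatives (1) or not (0).
X_train = rng.normal(size=(1000, 8))
y_train = (X_train[:, 0] + 0.5 * X_train[:, 1] + rng.normal(size=1000) > 1.0).astype(int)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Score a new population and rank by predicted likelihood. In the system
# the sources describe, such a ranked list would then be passed to
# intelligence officers for review.
X_population = rng.normal(size=(200, 8))
scores = model.predict_proba(X_population)[:, 1]  # P(class == 1)
ranking = np.argsort(scores)[::-1]                # highest likelihood first

THRESHOLD = 0.9  # hypothetical cutoff for flagging a candidate
flagged = [(int(i), float(scores[i])) for i in ranking if scores[i] >= THRESHOLD]
```

The essential design point is the decision threshold: lowering it flags more people while admitting more false positives, which is the trade-off the error-rate figures reported below describe.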

According to testimony from six IDF intelligence officers, the human evaluation step was largely not carried out in practice during the early stages of the Israel–Hamas war.[1] After checking only that a target was male, officers typically accepted Lavender's statistical judgements, saving time while deflecting personal responsibility by treating false positives as statistical error.
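The testimony describes a nominally human-in-the-loop pipeline collapsing into a single check. The sketch below restates that reported flow; every name, type, and interface is invented for illustration and none reflects the real systems' design.

```python
# Hypothetical restatement of the reported targeting flow; all names,
# interfaces, and logic are invented for illustration.
from dataclasses import dataclass

@dataclass
class Candidate:
    person_id: str
    likelihood: float  # score from the identification stage ("Lavender")
    is_male: bool

def human_review(c: Candidate) -> bool:
    # Per the officers' testimony, review was often reduced to verifying
    # the target's gender and accepting the model's score.
    return c.is_male

def track_until_home(c: Candidate) -> bool:
    # Stage the sources attribute to "Where's Daddy": signal once the
    # target is observed entering their home. Stubbed for illustration.
    return True

def select_strikes(candidates: list[Candidate], threshold: float = 0.9) -> list[Candidate]:
    # Identification has already scored the candidates; filter by threshold,
    # apply the minimal human check, then wait for the tracking signal.
    return [c for c in candidates
            if c.likelihood >= threshold and human_review(c) and track_until_home(c)]
```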

Although the IDF's own estimates hold that 90% of Lavender's predictions identify valid military targets by the IDF's definition, the remaining 10% may include people who merely share a name with a Hamas operative or who communicate in a similar style.[2] This has led to an unusually large proportion of civilians being targeted through Lavender, including aid workers.
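A back-of-envelope calculation makes the scale of that error rate concrete. The figures below come directly from the numbers reported above, and the result is illustrative only, since the 90% figure is itself an internal IDF estimate.

```python
# Back-of-envelope: misidentifications implied by the reported figures.
marked = 37_000    # individuals reportedly marked by the system
precision = 0.90   # IDF's internal estimate of valid identifications

misidentified = marked * (1 - precision)
print(f"Implied misidentifications: {misidentified:,.0f}")  # 3,700
```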

See also

  • Lethal autonomous weapon – Autonomous military technology system
  • Israel–Hamas war – Ongoing armed conflict in the Middle East
  • Unit 8200 – Intelligence unit of the Israel Defense Forces

References

  1. Abraham, Yuval (3 April 2024). "'Lavender': The AI machine directing Israel's bombing spree in Gaza". +972 Magazine. Retrieved 12 May 2024.
  2. McKernan, Bethan (3 April 2024). "'The machine did it coldly': Israel used AI to identify 37,000 Hamas targets". The Guardian. Retrieved 12 May 2024.