Draft:Lavender (software)

From Wikipedia, the free encyclopedia

Lavender is an artificial intelligence-powered tool[1] reportedly used by the Israeli military to track and target Hamas militants in the Gaza Strip. The Israel Defense Forces (IDF) have denied, however, that the program generates a "kill list".

How it works

Information on known Hamas militants is compiled into a dataset. Drawing on intelligence and surveillance gathered in the Gaza Strip, including the tracking of phones and other devices, the system locates suspected militants, who are then targeted by the Israeli Air Force once approval is given.

"Where's Daddy?" system

The "Where's Daddy?" system tracks alleged militants within the Gaza Strip until they reach their homes, which are then targeted by Israeli forces.

"The Gospel" system

"The Gospel" is a system under Lavender used to mark buildings used by Hamas. The Gospel has unfortunately resulted in the deaths of many civilians.

Criticism

Sources claim that Lavender's use has been accompanied by a permissive policy toward civilian casualties, with the system marking even low-ranking Hamas and Palestinian Islamic Jihad (PIJ) operatives,[2] and that it has a margin of error of about 10%.[3] Applied to the roughly 37,000 people the system reportedly marked, such an error rate would correspond to thousands of misidentifications.[3] Critics of the program also state that unguided "dumb" bombs have been used against targets identified by the system, inflicting additional unnecessary casualties.[4]

References

  1. ^ Iraqi, Amjad (2024-04-03). "'Lavender': The AI machine directing Israel's bombing spree in Gaza". +972 Magazine. Retrieved 2024-05-30.
  2. ^ Del Valle, Gaby (2024-04-04). "Report: Israel used AI to identify bombing targets in Gaza". The Verge. Retrieved 2024-05-30.
  3. ^ McKernan, Bethan; Davies, Harry (2024-04-03). "'The machine did it coldly': Israel used AI to identify 37,000 Hamas targets". The Guardian. ISSN 0261-3077. Retrieved 2024-05-30.
  4. ^ Tharoor, Ishaan (2024-04-05). "Analysis | Israel offers a glimpse into the terrifying world of military AI". Washington Post. ISSN 0190-8286. Retrieved 2024-05-30.