Artificial intelligence used by Israel to select targets for bombing in Gaza

An investigation by +972 Magazine and Local Call, citing six Israeli intelligence officials involved in the alleged program, claims that the Israeli military has been using artificial intelligence to help identify bombing targets in Gaza. The officials also claim that human review of the suggested targets was cursory at best.

Artificial intelligence used by Israel

According to the officials, who were quoted at length in the investigation by the joint Israeli-Palestinian online publication, the AI-powered tool was known as “Lavender” and had a 10% error rate.

In response to +972 Magazine’s article, the Israel Defense Forces (IDF) acknowledged the tool’s existence but denied that AI is being used to identify suspected terrorists. In a lengthy statement, however, it said that information systems are merely tools used by analysts to identify targets, and that Israel tries to minimize harm to civilians as much as is practical given the operational circumstances at the time of a strike.

According to IDF directives, analysts are required to conduct independent examinations to verify that the targets they identify meet the relevant definitions under international law and any additional restrictions.

But according to one official who spoke with +972, human staff often served as a mere “rubber stamp” for the machine’s decisions, typically devoting 20 seconds or less to each target, just long enough to confirm the target was male, before approving a bombing.

The investigation comes amid increased international scrutiny of Israel’s military campaign, following the deaths of several foreign humanitarian workers who were killed in targeted airstrikes while distributing food in the Palestinian enclave. More than 32,916 people have died as a result of Israel’s siege of Gaza, according to the Gaza Ministry of Health. A UN-backed report states that the humanitarian crisis has spiraled out of control, with nearly three-quarters of the population in northern Gaza experiencing catastrophic levels of hunger.

The report’s author, Yuval Abraham, previously spoke with CNN in January about his research into how the Israeli military has been using artificial intelligence to generate targets for such killings with little human oversight.

In a statement released on Wednesday, the IDF said the Israeli military does not use artificial intelligence to identify terrorist operatives or to determine whether a person is a terrorist. Rather, its analysts build up-to-date layers of information on the military operatives of terrorist organizations by cross-referencing intelligence sources in a database.

According to the IDF statement, human officers are then responsible for verifying that the designated targets meet the relevant definitions under international law and any additional constraints set out in IDF directives. +972’s reporting, however, questions how rigorously this procedure is applied in practice.

Attacks at night
The magazine also said that targets were “systematically attacked” by the Israeli army in their homes, usually at night when entire families were present.

As a result, the sources said, hundreds of Palestinians, most of them women, children, or civilians not involved in the fighting, were killed by Israeli airstrikes because of the AI program’s decisions, particularly in the early weeks of the conflict.

According to the report, which cited sources, “the army preferred” to use so-called dumb bombs, unguided munitions that can cause extensive damage, when the targets were alleged junior militants.

CNN reported in December that such dumb bombs, which can pose a greater danger to civilians, especially in a densely populated territory like Gaza, accounted for nearly half of the 29,000 air-to-surface munitions dropped on Gaza last year.

In its statement, the IDF said it tries to minimize harm to civilians as much as is practical given the operational circumstances, and that it refrains from carrying out strikes when the expected collateral damage would be disproportionate to the military advantage.

It also said that the IDF vets targets before strikes and selects appropriate munitions based on operational and humanitarian considerations, assessing the target’s structural and geographical features, its environment, possible effects on nearby civilians, and critical infrastructure in the area, among other factors.

Israeli officials have long maintained that Hamas must be destroyed to end the conflict, which began on October 7 when Hamas gunmen killed about 1,200 people in Israel and kidnapped hundreds more.
