
In my prior blog, I asked US and international community leaders to follow up on the question: "How does Israel explain the proportionality of bombing legitimate HAMAS targets versus the high numbers of collateral civilian casualties in the Israeli military's response attacks soon after October 7, 2023, which were far outside past Israeli standards?"
The following is a possible explanation. (This is everyone's worst nightmare.)
Alperen Aktas published an updated article on December 4, 2023, "Israel using AI to identify potential targets 'to maximize Palestinian casualties': Report," in which he alleges the following:
Excerpts:
"(One) reason for a higher civilian casualty rate in Tel Aviv's recent attacks compared to previous similar ones is the widespread use of the Habsora ('The Gospel') system, AI technology developed by the Israeli military that can automatically generate targets at a rate 'beyond what was previously possible.'"
“In 2019, the Israeli military established a new unit to accelerate target generation through the use of AI in its operations.”
“The +972 and Local Call research also referred to former Israeli army chief Aviv Kochavi’s in-depth interview with Ynet newspaper earlier this year, in which he mentioned that this unit is made up of “hundreds of officers and soldiers” and relies on AI capabilities.”
“Kochavi described the Habsora technology developed in this unit as a machine that, with the help of AI, processes large amounts of data better and faster than any human, transforming them into attack targets.”
"He said that since its deployment in the 2021 Operation Guardian of the Walls, this machine has generated 100 new targets every day, surpassing the previous record of 50 targets generated annually in Gaza."
“According to the +972 and Local Call research report, targets are automatically prepared and worked on according to a checklist.”
“They emphasized the fast-paced nature of the operation, comparing it to a factory where the evaluation is based on how many targets can be produced.”
“According to intelligence sources, Habsora generates automatic suggestions to attack private residences where individuals suspected of belonging to Hamas or Islamic Jihad reside. Israel then conducts extensive assassination operations by heavily bombing these homes, said the report.”
“Habsora can process “vast amounts of data that tens of thousands of intelligence officers couldn’t handle” and suggests attack targets in real time, it added.”
“When Israel launched massive attacks, many top-level Hamas officials turned to underground tunnels, allowing the use of this system to locate and target other Hamas members’ homes, the report said.”
In a July 17, 2023, Defense Post article by Joe Saballa, the following concerns were listed:
“Sources admitted to Bloomberg that the cutting-edge AI system is not subject to international or state-level regulation.”
“It also fuels concerns among military observers about the potential consequences of relying heavily on autonomous systems.”
“If there is a mistake in the calculation of the AI, and if the AI is not explainable, then who do we blame for the mistake?” Hebrew University of Jerusalem professor Tal Mimran wondered. “You can wipe out an entire family based on a mistake.”
"There are those who see AI as the next revolution in changing the face of warfare in the battlefield," Israeli Defense Ministry Director General Eyal Zamir said during an annual international security forum.
“Jerusalem has already started the process by unveiling an autonomous intelligence-gathering submarine, which has reportedly completed “thousands of hours” of operations.”
See:
"'The Gospel': how Israel uses AI to select bombing targets in Gaza," The Guardian
"IDF says it's using AI to quickly identify and strike new targets," The Times of Israel