Fact or fiction? Israeli maps and AI do not save Palestinian lives
Marc Owen Jones
On December 2, the Israeli army’s Arabic-language spokesperson Avichay Adraee posted a map of Gaza, broken up into a grid of numbered blocks with instructions that Palestinians living in certain areas evacuate to Rafah. Leaflets containing a QR code linking to the map on the Israeli army’s website were also dropped over Gaza.
This move came as Israeli fighter jets bombarded the south of the Strip – previously designated as a “safe zone” – killing hundreds of Palestinians in 24 hours. The Israeli army proudly announced that it had hit “400 targets”.
Meanwhile, media reports revealed that the Israeli army’s ability to intensify what it calls “precision” air strikes has been boosted by an artificial intelligence (AI) tool that generates “targets”.
The maps, the leaflets, the tweets, the claims of “precision” military technology, all feed into the narrative that Israel’s “most moral army” is taking care to protect civilians in Gaza. But all these are no more than a propaganda ploy to cover up what really is happening on the ground – an AI-assisted genocide.
A game of maps
Over the past two months of brutal war, Israel has constantly resorted to the use of “evacuation” maps and warnings issued on social media, calling on Palestinians to flee certain areas of Gaza.
Yet the mounting death toll – nearly 16,000 people and thousands more missing and likely dead – offers no evidence that Israel is in fact concerned about the wellbeing of Palestinian civilians.
What it is concerned about is the growing condemnation abroad of what legal experts are calling genocide, and the increasing pressure from the United States.
Just a few days ago, US Secretary of State Antony Blinken warned Israel it has weeks, and not months, to finish its campaign in Gaza. His boss, President Joe Biden, is acutely aware of the growing domestic discontent with how he is handling the war, which could cost him votes in next year’s presidential election.
This “evacuation messaging” the Israeli army has been undertaking is directed more at Western audiences, seeking to assuage their fears about the civilian death toll, than at Palestinians in Gaza. The fact that it is delivered mostly on social media platforms indicates the intended audience is not the people in the Strip.
The Israeli army has not only cut off electricity to Gaza but also targeted and damaged its already temperamental mobile network, thus leaving most of the people there with almost no access to the internet.
The leaflets that were dropped over the weekend are also not worth the paper they were printed on. The QR code on them is of use only if there is a working phone with a charged battery and internet access.
Discrepancies between the different maps shared by Israeli officials have also caused additional confusion. Areas marked for targeting in orange did not even correspond to the numbers of the blocks officials were telling people to evacuate from.
Consequently, the overall impact of the maps has been to create “fear, panic and confusion”, as Melanie Ward, CEO of Medical Aid for Palestinians, explained in a tweet.
Furthermore, the detailed mapping and dissection of Gaza are designed to create the illusion of precision and precaution, but the evacuation orders behind them demonstrate the opposite.
Gaza is 360 square kilometres and has a population of 2.3 million. The average size of each of the 620 blocks on the map is 0.58 square kilometres, which means approximately 3,700 residents per block.
Asking the residents of dozens of blocks, tens of thousands of people, to move is hardly “precision”. It is mass displacement masquerading as parsimonious precaution.
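The per-block figures above can be checked with a quick sketch using the article's numbers. Note that the residents-per-block average assumes the population is spread evenly across the Strip, which is a simplifying assumption:

```python
# Figures cited in the article: Gaza's area, population, and the
# number of numbered blocks on the Israeli evacuation map.
area_km2 = 360
population = 2_300_000
blocks = 620

avg_block_area = area_km2 / blocks    # average block size in sq km
avg_residents = population / blocks   # average residents per block

print(round(avg_block_area, 2))  # 0.58
print(round(avg_residents))      # 3710, i.e. roughly 3,700 per block
```

Evacuating even a handful of blocks therefore means displacing tens of thousands of people at once.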
Israel’s digital killing machine
Apart from using digital maps and QR codes to try to prove to its allies that its army is not reckless, Israel is also boasting about its “precision” military technologies.
Among them is an AI weapons system called “Habsora” (“The Gospel”) which can quickly and automatically identify targets, much faster than older methods.
Where the Israeli army would manually select 50 targets per day in previous bombing campaigns, the new system today provides 100.
According to one source quoted by +972 Magazine, this weapon has turned the Israeli army into a “mass assassination factory”, focused more on “quantity and not quality”.
The magazine reports that the Israeli soldiers using the AI targeting system are aware of the number of civilians they will kill; it is displayed in the category “collateral damage” in the target file.
The Israeli army has categorised thresholds of civilian deaths, ranging from five to the hundreds. The directive “collateral damage five”, for example, means Israeli soldiers are authorised to strike a target even if five civilians will be killed alongside them.
On the higher end, “the Israeli military command knowingly approved the killing of hundreds of Palestinian civilians in an attempt to assassinate a single top Hamas military commander”, +972 magazine reports.
Given that Israel considers all 30,000 Hamas members in Gaza as potential targets, this means that “wiping out” the movement would entail a massive civilian death toll. If we use the lowest “collateral damage five”, the most conservative estimate amounts to 150,000 civilians.
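The back-of-the-envelope figure above follows directly from the article's two numbers. A minimal sketch, assuming every one of the 30,000 targets is struck at the lowest reported threshold:

```python
# Figures cited in the article: 30,000 Hamas members treated as
# potential targets, each strike authorised at the lowest
# "collateral damage five" threshold reported by +972 Magazine.
targets = 30_000
civilians_per_target = 5  # the most conservative threshold

print(targets * civilians_per_target)  # 150000
```

Any target struck at a higher threshold would push the implied civilian toll well beyond that floor.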
Of course, as Hamas leaders that are killed are inevitably replaced, hundreds more Palestinians will be murdered as the AI system generates new targets. Since Hamas cannot be defeated militarily, the only logical outcome of this will be the perpetual murder or removal of everyone in Gaza.
Another disturbing element of AI is that it reproduces biases it has been trained on. Historically, Israel has shown little regard for civilian life in its bombing. One has to wonder to what extent the secretive AI has learned to associate any Palestinian with “Hamas terrorist” based on past Israeli army behaviour. This might explain why it is able to generate so many new “targets” for bombing.