Thank you, dear subscribers; we are overwhelmed by your response.
Your Turn is a unique section from ThePrint featuring points of view from its subscribers. If you are a subscriber and have a point of view, please send it to us. If not, do subscribe here: https://theprint.in/
According to a Reuters report of January 24, Hamas had recruited between 10,000 and 15,000 members since the start of its war with Israel. On January 14, then US Secretary of State Antony Blinken said the United States believed Hamas had recruited almost as many fighters as it had lost in the Palestinian enclave, cautioning that this was a "recipe for an enduring insurgency and perpetual war."
Among the several factors behind this blowback, manifested in the resurgence of Hamas cadre numbers, target profiling carried out by the Israel Defence Forces (IDF) through Artificial Intelligence (AI) has been a major one. An investigative report based on testimony from IDF officers involved in employing AI systems to identify terrorist targets in Gaza was published by the Israeli-Palestinian publication +972 Magazine.
In the military operations the IDF conducted before the Hamas attack of October 7, 2023 (7/10), ascertaining human targets was a deliberate and rigorous process that involved discussion and endorsement by a legal adviser. After 7/10, the methodology was fast-tracked as the demand for targets soared. To meet this demand, the IDF came to rely heavily on an AI system called Lavender to generate a database of suspects based on established and imagined characteristics of a Hamas cadre.
As with most AI systems employed for human surveillance, the data set used to train Lavender carried elements of bias, leading to inaccuracies. Lavender was believed to achieve a 90 percent accuracy rate, and the remaining ten percent of misidentifications were accepted as collateral damage: across a database of tens of thousands of suspects, that error rate translates into thousands of people wrongly marked.
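A minimal back-of-the-envelope sketch in Python makes the scale of that error rate concrete. The pool size of 30,000 is purely illustrative, a hypothetical stand-in for the "tens of thousands" the +972 report describes; only the 90 percent accuracy figure comes from the reporting cited above.

```python
# Illustrative arithmetic only: the pool size is a hypothetical stand-in
# for the "tens of thousands" of suspects the +972 report describes.
flagged_pool = 30_000  # assumed number of individuals marked by the system
accuracy = 0.90        # accuracy rate attributed to Lavender in the report

# Everyone outside the accuracy rate is a wrongly flagged person.
false_positives = flagged_pool * (1 - accuracy)
print(f"Wrongly flagged individuals: {false_positives:,.0f}")
# -> Wrongly flagged individuals: 3,000
```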
As per the +972 report, Lavender created a database of tens of thousands of suspected low-ranking Hamas cadres. It was used alongside another AI-based system, called the Gospel, which recommended buildings and structures as targets rather than individuals. These buildings were essentially the private homes of suspected individuals: the IDF preferred going after Hamas cadres when they were in their homes, with their families. It was much easier to bomb a family's home, and a system called "Where's Daddy?" was trained to look for targets in precisely such situations. The deaths of large numbers of women and children in Gaza were a fall-out of this targeting methodology.
As the bombardment progressed and intensified after 7/10, the IDF's targeting process was relaxed. As the +972 report put it, there was "a completely permissive policy regarding the casualties of bombing operations", one that appeared to carry an element of revenge. In any case, Lavender was all about profiling junior operatives and low-ranking targets who, as per the IDF, did not warrant much deliberation in the engagement decision process. The time available was simply insufficient to analyse every target. Approval was readily accorded to engage targets straight from Lavender's database, with no requirement to cross-check or to corroborate against any other type of intelligence.
The ten percent inaccuracy mentioned earlier stemmed primarily from the system's tendency to occasionally mark individuals on the basis of a loose connection to militant groups, or no connection at all. That ten percent of the designated targets were not members of the Hamas military wing at all was a constant taken for granted. As the +972 report explains, Lavender sometimes flagged individuals mistakenly, on conjectures arising from the biased data set it was trained on. People whose communication patterns resembled a militant's, police and civil defence workers, militants' relatives, namesakes, and Gazans who used a device that once belonged to a Hamas operative: all came within the scope of Lavender.
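A short, entirely hypothetical sketch may help illustrate how such association-based scoring produces precisely these false positives. Lavender's actual design is not public; the feature names, weights, and threshold below are invented for illustration only.

```python
# Hypothetical illustration of association-based scoring. None of these
# features, weights, or the threshold describe the real system; they only
# show how weak correlations can push an uninvolved person over a cut-off.

WEIGHTS = {
    "communication_pattern_match": 0.4,  # talks like a flagged person talks
    "relative_of_militant": 0.3,         # a family tie, not conduct
    "shares_name_with_operative": 0.2,   # a namesake
    "used_device_of_operative": 0.5,     # a second-hand phone or SIM
}
THRESHOLD = 0.6  # arbitrary cut-off for marking someone a "suspect"

def score(person_features: dict[str, bool]) -> float:
    # Sum the weights of whichever loose associations happen to be present.
    return sum(w for f, w in WEIGHTS.items() if person_features.get(f))

# A civil defence worker who bought a second-hand phone and has a cousin
# in a militant group: no militant conduct, yet flagged.
bystander = {"used_device_of_operative": True, "relative_of_militant": True}
print(score(bystander) >= THRESHOLD)  # True -> marked as a target
```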
In a nutshell, the IDF's increased acceptance of disproportionate collateral damage in its bid to eliminate Hamas has resulted in the killing of more than 46,000 individuals. The surviving members of every single family, whatever their allegiance to the Hamas military wing, remain wounded, physically and psychologically. That such individuals join the ranks of Hamas, in the absence of any alternative, should not surprise the international community. Hamas's regained numbers point to the blowback phenomenon normally associated with the bad blood of terrorism and its after-effects.

International humanitarian law, also known as the law of armed conflict, aims at limiting the ill-effects of war; its scope warrants expansion to cover contemporary technologies. A global consensus on the ethical military application of emerging and disruptive technologies like AI is indeed an idea whose time has come. It is hoped that such an aspiration is not too ambitious for the transactional times we live in.
These pieces are being published as they have been received – they have not been edited/fact-checked by ThePrint.