Israeli army used controversial 'Lavender' AI system to create 'kill list' of Palestinian militants and bomb 37,000 targets, report claims

Daily Mail - Science & tech 

The Israeli army has been using an AI system to populate its 'kill list' of alleged Hamas terrorists, leading to the deaths of women and children, a new report claims.

The report cited six Israeli intelligence officers, who admitted to using an AI called 'Lavender' to classify as many as 37,000 Palestinians as suspected militants -- marking these people and their homes as acceptable targets for air strikes.

Israel has vehemently denied the AI's role, with an army spokesperson describing the system as 'auxiliary tools that assist officers in the process of incrimination.'

Lavender was trained on data from Israeli intelligence's decades-long surveillance of Palestinian populations, using the digital footprints of known militants as a model for what signal to look for in the noise, according to the report.

The intel sources noted that human officers scanned each AI-chosen target for about '20 seconds' before giving their 'stamp' of approval, despite an internal study that had determined the Lavender AI misidentified people 10 percent of the time.

Israel quietly delegated the identification of Hamas terrorists, Palestinian civilians and aid workers to an artificial intelligence, 'Lavender,' the new report revealed.