Israel used artificial intelligence to identify 37,000 Hamas targets.

The Israeli military’s bombing campaign in Gaza relied on a previously undisclosed AI-powered database, code-named Lavender, to identify potential targets associated with Hamas, according to intelligence sources involved in the conflict. These sources, who have firsthand experience using the AI system, revealed that the Israeli military permitted a significant number of Palestinian civilian casualties, especially during the early stages of the war.

In exclusive testimonies shared with the Guardian ahead of their publication in +972 Magazine and Local Call, six intelligence officers described how Lavender played a central role in rapidly identifying potential targets linked to Hamas and Palestinian Islamic Jihad (PIJ). At one point, the AI system reportedly listed 37,000 Palestinian men as potential targets.

Developed by Unit 8200, the Israel Defense Forces’ elite intelligence division, Lavender raised questions about the legality and morality of using advanced AI systems in warfare. Some intelligence officers expressed doubts about the necessity of human involvement in the target selection process, with one officer stating that they felt like a mere “stamp of approval” rather than adding any value to the decision-making process.

The testimonies also shed light on the IDF’s targeting processes and its use of pre-authorized allowances for civilian casualties during airstrikes. According to the sources, unguided “dumb” bombs were often used to strike low-ranking militants, resulting in the destruction of entire homes and the deaths of civilians inside.

The testimonies revealed a shift in the IDF’s approach to targeting individuals, with an emphasis on targeting Palestinian men linked to Hamas’s military wing, regardless of their rank or importance. The sources described a permissive policy regarding civilian casualties during the conflict, with one source suggesting that there was an element of revenge in the targeting process.


Overall, the testimonies provide insight into the use of AI systems in modern warfare and raise important questions about the ethical implications of relying on machine-learning algorithms to identify and target enemy combatants. As the conflict in Gaza continues, the role of AI in military operations is likely to remain a topic of debate and scrutiny.