Russia-Ukraine war: List of key events, day 588

Al Jazeera

The governor of Russia's Bryansk region accused Ukraine of using cluster munitions against a Russian village near the Ukrainian border. Several houses in the village of Klimovo were damaged, although no casualties were reported. The Ukrainian Air Force said it destroyed 29 of 31 drones and one cruise missile launched by Russia, mostly towards the regions of Mykolaiv and Dnipropetrovsk, during overnight attacks that lasted more than three hours. Falling debris from destroyed Russian drones caused fires in Dnipro and at an industrial enterprise in Pavlograd, two cities in Ukraine's eastern Dnipropetrovsk region. Firefighters extinguished both fires, and there were no initial reports of casualties.


Ukraine attacked Russian village with cluster munitions: Governor

Al Jazeera

The governor of Russia's Belgorod region has said that Ukraine fired cluster munitions at a village near the Ukrainian border on Friday, but that there were no casualties or damage. The governor made the statement on Saturday during a daily briefing on his Telegram channel, without providing visual evidence. There was no immediate comment from Ukrainian authorities. "In Belgorod district, 21 artillery shells and three cluster munitions from a multiple-launch rocket system were fired at the village of Zhuravlevka," Governor Vyacheslav Gladkov said. Ukraine received cluster bombs from the United States this month, but it has pledged to use them only to dislodge concentrations of enemy soldiers. They contain dozens of small bomblets that rain shrapnel over a wide area, but are banned in many countries due to the potential danger they pose to civilians.


Artificial Intelligence and Arms Control

Scharre, Paul, Lamberth, Megan

arXiv.org Artificial Intelligence

Potential advancements in artificial intelligence (AI) could have profound implications for how countries research and develop weapons systems, and how militaries deploy those systems on the battlefield. The idea of AI-enabled military systems has motivated some activists to call for restrictions or bans on some weapon systems, while others have argued that AI may be too diffuse to control. This paper argues that while a ban on all military applications of AI is likely infeasible, there may be specific cases where arms control is possible. Throughout history, the international community has attempted to ban or regulate weapons or military systems for a variety of reasons. This paper analyzes both successes and failures and offers several criteria that seem to influence why arms control works in some cases and not others. We argue that success or failure depends on the desirability (i.e., a weapon's military value versus its perceived horribleness) and feasibility (i.e., sociopolitical factors that influence its success) of arms control. Based on these criteria, and the historical record of past attempts at arms control, we analyze the potential for AI arms control in the future and offer recommendations for what policymakers can do today.


AI helps scour video archives for evidence of human-rights abuses

#artificialintelligence

Thanks especially to ubiquitous camera-phones, today's wars have been filmed more than any in history. Consider the growing archives of Mnemonic, a Berlin charity that preserves video that purports to document war crimes and other violations of human rights. If played nonstop, Mnemonic's collection of video from Syria's decade-long war would run until 2061. Mnemonic also holds seemingly bottomless archives of video from conflicts in Sudan and Yemen. Even greater amounts of potentially relevant additional footage await review online.
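To give a sense of scale, here is a hedged back-of-envelope sketch of what "would run until 2061" implies. It assumes the article was written around 2021 (an assumption not stated in the excerpt), so the archive would hold roughly 40 years of continuous footage:

```python
# Back-of-envelope estimate of the Mnemonic Syria archive's runtime.
# ASSUMPTION: playback starts in 2021, the approximate year of the article.
start_year = 2021          # assumed publication year (not stated in the excerpt)
end_year = 2061            # stated in the article
years_of_video = end_year - start_year
hours_of_video = years_of_video * 365 * 24  # ignoring leap days

print(f"~{years_of_video} years, or about {hours_of_video:,} hours of video")
```

Under that assumption, the archive amounts to roughly 350,000 hours of footage, which is why manual review is infeasible and AI-assisted search becomes attractive.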


AI Emerges as Crucial Tool for Groups Seeking Justice for Syria War Crimes

WSJ.com: WSJD - Technology

So as the United Nations, European authorities and human-rights groups build war-crimes cases, they have turned to a novel tool: artificial intelligence. With the regime of President Bashar al-Assad emerging largely victorious from nearly a decade of conflict, efforts to bring about some measure of accountability are gaining speed, largely in European courts. Since the beginning of Syria's conflict, activists on the ground have risked their lives to document human-rights violations, from torture and attacks on protesters to indiscriminate rocket strikes and barrel bombs. Now, AI and machine learning could play an integral role in bringing war criminals in Syria to justice by helping to sort through the huge trove of evidence, and could serve as a model for investigations into other modern-day conflicts. "You have a use of technology both to disseminate the information, capture it, and now to search it that is suddenly very different and changes the way you work," said Catherine Marchi-Uhel, who heads the United Nations body tasked with collecting Syrian evidence and building cases.


Sandia's Robots Pull Apart Warheads to Recycle Thousands of Micro-Grenades

IEEE Spectrum Robotics

The United States builds a lot of weapons. Unless a lot of really bad stuff happens all at once, we build more weapons than we can possibly use, and since we keep inventing new ones that are better at doing what weapons do, all the old stuff tends to just pile up. These piles of old explosives aren't aging particularly well, leaving us with few options: forgetting about them for longer than is probably safe, or blowing them up. A third option is disassembly and recycling, but that's dangerous for humans, because these weapons can be very old and very lethal. Sandia National Labs has been helping the Department of Defense deal with some of its stockpile of M26 rockets, which are packed full of tiny little grenades and need to be taken apart very carefully.


'Killer robots': AI experts call for boycott over lab at South Korea university

The Guardian

Artificial intelligence researchers from nearly 30 countries are boycotting a South Korean university over concerns a new lab in partnership with a leading defence company could lead to "killer robots". More than 50 leading academics signed the letter calling for a boycott of Korea Advanced Institute of Science and Technology (KAIST) and its partner, defence manufacturer Hanwha Systems. The researchers said they would not collaborate with the university or host visitors from KAIST over fears it sought to "accelerate the arms race to develop" autonomous weapons. "There are plenty of great things you can do with AI that save lives, including in a military context, but to openly declare the goal is to develop autonomous weapons and have a partner like this sparks huge concern," said Toby Walsh, the organiser of the boycott and a professor at the University of New South Wales. "This is a very respected university partnering with a very ethically dubious partner that continues to violate international norms."