Activists warn UN about dangers of using AI to make life-and-death decisions on the battlefield
A Nobel Peace Prize winner has warned against robots making life-and-death decisions on the battlefield, calling the practice 'unethical and immoral' and warning that its consequences 'can never be undone'. Jody Williams made the statement at the United Nations in New York City after the US military announced a project that uses AI to make decisions on what human soldiers should target and destroy.

Williams also pointed out the difficulty of holding those involved accountable for certain war crimes, as a programmer, manufacturer, commander and the machine itself would all be involved in the act.

Williams, who was accompanied by fellow activists Liz O'Sullivan and Mary Wareham, won the prestigious accolade in 1997 after leading efforts to ban landmines and is now an advocate with the 'Campaign To Stop Killer Robots'.
Oct-22-2019, 23:25:14 GMT