Sorry, Banning 'Killer Robots' Just Isn't Practical

WIRED

That's not because it's impossible to ban weapons technologies: some 192 nations have signed the Chemical Weapons Convention, which bans chemical weapons, for example. But the UK government has not suggested it would be open to an international agreement banning autonomous weapons. In 2015, it responded to calls for a ban on autonomous weapons by saying there was no need for one, and that existing international law was sufficient.


Elon Musk leads 116 experts calling for outright ban of killer robots

#artificialintelligence

In their letter, the founders warn the review conference of the Convention on Conventional Weapons that this arms race threatens to usher in the "third revolution in warfare" after gunpowder and nuclear arms. This is not the first time the IJCAI, one of the world's leading AI conferences, has been used as a platform to discuss lethal autonomous weapons systems. The UK government has said that it is not developing lethal autonomous weapons and that all weapons employed by UK armed forces will be "under human oversight and control". Its unmanned combat aerial vehicle, about the size of a BAE Hawk, the plane used by the Red Arrows, had its first test flight in 2013 and is expected to be operational sometime after 2030 as part of the Royal Air Force's Future Offensive Air System, destined to replace the human-piloted Tornado GR4 warplanes.


Elon Musk and 115 robotics experts write to the UN "raising the alarm" on killer robots

#artificialintelligence

Back in 2015, more than 1,000 academics sent an open letter calling for a ban on "offensive autonomous weapons beyond meaningful human control." While that original 2015 letter called for an outright ban on autonomous weapons, the new letter stops short of doing so, leaving a lot to the imagination. Given that it is hard to think of a way of protecting us from the dangers of autonomous weapons that doesn't involve an outright ban, it is slightly odd that the letter doesn't explicitly call for one. The 123 nations that make up the international Convention on Conventional Weapons voted last December to formally discuss the issue of autonomous weapons, although the British government has previously been resistant to the idea of a complete ban, stating as recently as 2015 that "we do not see the need for a prohibition on the use of LAWS, as international humanitarian law already provides sufficient regulation for this area."


Elon Musk is right: we should all be worried about killer robots

#artificialintelligence

Tesla and SpaceX CEO Elon Musk, along with 115 other artificial intelligence and robotics specialists, has signed an open letter urging the United Nations to recognize the dangers of lethal autonomous weapons and to ban their use internationally. There are already numerous weapons, like automatic anti-aircraft guns and drones, that can operate with minimal human oversight; advanced technology will eventually enable them to carry out military functions entirely autonomously. To illustrate why this is a problem, consider the UK government's argument when it opposed a ban on lethal autonomous weapons in 2015: it said that "international humanitarian law already provides sufficient regulation for this area," and that all weapons employed by UK armed forces would be "under human oversight and control." I signed the open letter because the use of AI in autonomous weapons offends my sense of ethics, would be likely to lead to a very dangerous escalation, would hurt the further development of AI's beneficial applications, and is a matter that needs to be handled by the international community, as has been done in the past for other morally wrong weapons (biological, chemical, nuclear).


Elon Musk leads 116 experts calling for outright ban on killer robots

The Guardian

In their letter, the founders warn the review conference of the Convention on Conventional Weapons that this arms race threatens to usher in the "third revolution in warfare" after gunpowder and nuclear arms. This is not the first time the IJCAI, one of the world's leading AI conferences, has been used as a platform to discuss lethal autonomous weapons systems. The UK government has said that it is not developing lethal autonomous weapons and that all weapons employed by UK armed forces will be "under human oversight and control". Its unmanned combat aerial vehicle, about the size of a BAE Hawk, the plane used by the Red Arrows, saw its first test flight in 2013 and is expected to be operational sometime after 2030 as part of the Royal Air Force's Future Offensive Air System, destined to replace the human-piloted Tornado GR4 warplanes.


Elon Musk leads 116 experts calling for outright ban on killer robots

#artificialintelligence

In their letter, the founders warn the review conference of the Convention on Conventional Weapons that this arms race threatens to usher in the "third revolution in warfare" after gunpowder and nuclear arms. Ryan Gariepy, founder of Clearpath Robotics, said: "Unlike other potential manifestations of AI which still remain in the realm of science fiction, autonomous weapons systems are on the cusp of development right now and have a very real potential to cause significant harm to innocent people along with global instability." This is not the first time the IJCAI, one of the world's leading AI conferences, has been used as a platform to discuss lethal autonomous weapons systems. The unmanned combat aerial vehicle, about the size of a BAE Hawk, the plane used by the Red Arrows, saw its first test flight in 2013 and is expected to be operational sometime after 2030 as part of the Royal Air Force's Future Offensive Air System, destined to replace the human-piloted Tornado GR4 warplanes.


The use of AI in politics is not going away anytime soon

#artificialintelligence

The next level will be the use of artificial intelligence in election campaigns and political life. Highly sophisticated micro-targeting operations rely on big data and machine learning to influence people's emotions. Typically disguised as ordinary human accounts, bots spread misinformation and contribute to an acrimonious political climate on sites like Twitter and Facebook. The same targeting could also serve voters: if a person is interested in environment policy, for example, an AI targeting tool could be used to help them find out what each party has to say about the environment.


Internet of incarceration: How AI could put an end to prisons as we know them - RN - ABC News (Australian Broadcasting Corporation)

#artificialintelligence

Professor Hunter's team is researching an advanced form of home detention, using artificial intelligence, machine-learning algorithms and lightweight electronic sensors to monitor convicted offenders on a 24-hour basis. "We are at the point now where we can fundamentally rethink the way in which we incarcerate people," Professor Hunter says. For criminologist and prison reform advocate Yvonne Jewkes, the Cameron reform package represented an opportunity. American reform advocate Doran Larson argues the strength of such facilities is that they offer a reintegration environment for offenders.

