Cybersecurity researchers on Thursday revealed a newly discovered vulnerability in an app that controls the world's most popular consumer drones, threatening to intensify the growing tensions between China and the United States. In two reports, the researchers contended that an app on Google's Android operating system that powers drones made by China-based Da Jiang Innovations, or DJI, collects large amounts of personal information that could be exploited by the Beijing government. The world's largest maker of commercial drones, DJI has found itself increasingly in the crosshairs of the United States government, as have other successful Chinese companies. The Pentagon has banned the use of DJI's drones, and in January the Interior Department decided to continue grounding its fleet of the company's drones over security fears. DJI said the decision was about politics, not software vulnerabilities.
These being pandemic times, a recent visit to the Silicon Valley offices of drone startup Skydio involved slipping past dumpsters into the deserted yard behind the company's loading dock. Moments later, a black quadcopter eased out of the large open door sounding like a large and determined wasp. Skydio is best known for its "selfie drones," which use onboard artificial intelligence to automatically follow and film a person, whether they're running through a forest or backcountry skiing. The most recent model, released last fall, costs $999. The larger and more severe-looking machine that greeted WIRED has similar autonomous flying skills but aims to expand the startup's technology beyond selfies into business and government work, including the military.
Stéphane Fymat, the head of that new business, said Honeywell expects the hardware and software market for urban air taxis, drone cargo delivery, and other drone businesses to reach $120 billion by 2030, and that Honeywell's addressable opportunity would be about 20% of that, or roughly $24 billion. He declined to say how much of that market Honeywell aims to capture, saying only that the unit has hundreds of employees, many of them engineers. Honeywell doesn't build drones itself but provides autonomous flight control systems and aviation electronics. The creation of the new business comes as the coronavirus pandemic fuels a surge of interest in drone deliveries; Fymat said the pandemic is accelerating the drone cargo delivery programs of some of Honeywell's partners. Honeywell's customers include Intel-backed Volocopter; Slovenia-based small aircraft maker Pipistrel, which is developing an electric vertical take-off and landing aircraft for cargo delivery; and UK-based Vertical Aerospace, which last year test-flew a prototype vehicle that can carry 250 kilograms and fly at 80 kilometers per hour.
Air power has played an increasingly important role in the Libyan conflict. The relatively flat, featureless desert terrain of the north and coast means that ground units are easily spotted, with few places to hide. The air forces of both the United Nations-recognised Government of National Accord (GNA) and eastern-based commander Khalifa Haftar's self-styled Libyan National Army (LNA) fly antiquated, poorly maintained French- and Soviet-era fighter jets. While manned fighter aircraft have been used, for the most part the air war has been fought by unmanned aerial vehicles (UAVs), or drones. With nearly 1,000 air strikes conducted by UAVs, UN Special Representative to Libya Ghassan Salame called the conflict "the largest drone war in the world".
Drone firm Zipline has been given the go-ahead to deliver medical supplies and personal protective equipment to hospitals in North Carolina. The firm will be allowed to use drones on two specified routes after the Federal Aviation Administration granted it an emergency waiver. It is the first time the FAA has allowed beyond-line-of-sight drone deliveries in the US. Experts say the pandemic could help ease some drone-flight regulations. Zipline, which has been negotiating with the FAA, wants to expand to other hospitals and eventually offer deliveries to people's homes.
This article is part of Privacy in the Pandemic, a Future Tense series. Since the pandemic began, authorities in New Delhi, Italy, Oman, Connecticut, and China have begun to experiment with fever-finding drones as a means of mass COVID-19 screening. They claim the aircraft can be used to better understand the health of the population at large and even to identify potentially sick individuals, who can then be pulled aside for further diagnostic testing. In Italy, police forces are reportedly using drones to read the temperatures of people who are out and about during quarantine, while officials in India are hoping to use thermal-scanner-equipped drones to search for "temperature anomalies" in people on the ground. A Lithuanian drone pilot even used a thermal-scanning drone to read the temperature of a sick friend who didn't own a thermometer.
As countries around the world gradually reopen following lockdowns, government authorities are using surveillance drones in an attempt to enforce social distancing rules. In India, police are using AI-equipped drones developed by US start-up Skylark Labs to monitor evening curfews and the distance between people who are outside during the day. The drones are being flown in six cities in the northern state of Punjab, and are also being trialled in the southern city of Bangalore, says Skylark Labs CEO Amarjot Singh. Each drone is fitted with a camera and an AI that can detect humans at a range of 150 metres to 1 kilometre. If it spots people, it can send an alert to the police district nearest the sighting.
A drone company that had to abandon its fast-food delivery tests has partnered with Ireland's health authority to deliver prescriptions instead. Manna Aero is working with the Health Service Executive to deliver medicines and other essential supplies to vulnerable people in the small rural town of Moneygall. The company's trial uses autonomous drones made in Wales. And it is looking at the possibility of testing in the UK within weeks. The UK has already announced a test of drones to carry supplies to the Isle of Wight during the pandemic.
This past fall, diplomats from around the globe gathered in Geneva to do something about killer robots. In a result that surprised nobody, they failed. The formal debate over lethal autonomous weapons systems, machines that can select and fire at targets on their own, began in earnest about half a decade ago under the Convention on Certain Conventional Weapons, the international community's principal mechanism for banning systems and devices deemed too hellish for use in war. But despite yearly meetings, the CCW has yet to agree on what "lethal autonomous weapons" even are, let alone set a blueprint for how to rein them in. Meanwhile, the technology is advancing ferociously; militaries aren't going to wait for delegates to pin down the exact meaning of slippery terms such as "meaningful human control" before sending advanced warbots to battle.
NAIROBI (Thomson Reuters Foundation) - Countries are rapidly developing "killer robots" - machines with artificial intelligence (AI) that kill independently - but are moving at a snail's pace on agreeing global rules over their use in future wars, warn technology and human rights experts. From drones and missiles to tanks and submarines, semi-autonomous weapons systems have been used for decades to eliminate targets in modern-day warfare - but they all have human supervision. Nations such as the United States, Russia and Israel are now investing in developing lethal autonomous weapons systems (LAWS) which can identify, target, and kill a person all on their own - but to date there are no international laws governing their use. "Some kind of human control is necessary ... Only humans can make context-specific judgements of distinction, proportionality and precautions in combat," said Peter Maurer, President of the International Committee of the Red Cross (ICRC).