As the economy begins the slow process of re-opening, advanced technologies such as artificial intelligence, machine learning, and natural language processing are playing a key role not only in monitoring COVID-19 outbreaks, but also in how companies manage the uncharted landscape before them. Use of AI during the earliest days of the pandemic centered on tracking the spread of the disease around the world. Today, AI is playing a critical role in how pharmaceutical and biotech companies research and test treatments, and in the development of a vaccine. And now, as states begin to reopen, and businesses try to find the best path forward, these advanced technologies are enabling them to figure out how to do this safely and effectively. "Companies don't have historical data to work from because they've never dealt with a crisis like this before," said Katie Stein, chief strategy officer for Genpact, a global professional services firm that specializes in digital transformation.
According to the World Health Organization, more than one billion people worldwide have disabilities. The field of disability studies defines disability through a social lens; people are disabled to the extent that society creates accessibility barriers. AI technologies offer the possibility of removing many accessibility barriers; for example, computer vision might help people who are blind better sense the visual world, speech recognition and translation technologies might offer real-time captioning for people who are hard of hearing, and new robotic systems might augment the capabilities of people with limited mobility. Considering the needs of users with disabilities can help technologists identify high-impact challenges whose solutions can advance the state of AI for all users; however, ethical challenges such as inclusivity, bias, privacy, error, expectation setting, simulated data, and social acceptability must be considered. The inclusivity of AI systems refers to whether they are effective for diverse user populations.
Machine vision coupled with artificial intelligence (AI) has made great strides toward letting computers understand images. Thanks to deep learning, which processes information in a way analogous to the human brain, machine vision is doing everything from keeping self-driving cars on the right track to improving cancer diagnosis by examining biopsy slides or x-ray images. Now some researchers are going beyond what the human eye or a camera lens can see, using machine learning to watch what people are doing on the other side of a wall. The technique relies on low-power radio frequency (RF) signals, which reflect off living tissue and metal but pass easily through wooden or plaster interior walls. AI can decipher those signals, not only to detect the presence of people, but also to see how they are moving, and even to predict the activity they are engaged in, from talking on a phone to brushing their teeth.
Despite the rapid advances it has made over the past decade, deep learning presents many industrial users with problems when they try to implement the technology, issues that the Internet giants have worked around through brute force. "The challenge that today's systems face is the amount of data they need for training," says Tim Ensor, head of artificial intelligence (AI) at U.K.-based technology company Cambridge Consultants. "On top of that, it needs to be structured data." Most of the commercial applications and algorithm benchmarks used to test deep neural networks (DNNs) consume copious quantities of labeled data; for example, images or pieces of text that have already been tagged in some way by a human to indicate what the sample represents. The Internet giants, who have collected the most data for use in training deep learning systems, have often resorted to crowdsourcing measures such as asking people to prove they are human during logins by identifying objects in a collection of images, or simply buying manual labor through services such as Amazon's Mechanical Turk.
Brandon Moak felt as if a freight train had hit him. It was mid-March, and the cofounder and CTO of the autonomous-trucking startup Embark Trucks had been keeping tabs on the emergence of covid-19. As a shelter-in-place order went into effect throughout the San Francisco Bay Area, where Embark is based, Moak and his team were forced to ground almost all their 13 self-driving semi-trucks (a few stayed on the road moving essential freight but weren't in autonomous mode) and send home the majority of their workforce, with no idea how long it'd be before they could return. For safety reasons, autonomous vehicles typically have two operators apiece. That's a no-go in the age of social distancing, and leaders of autonomous-vehicle companies knew they'd have to mothball their fleets.
The whistleblower who exposed in 2019 that Apple contractors listened to users' Siri recordings without their knowledge or consent has gone public to protest the lack of action taken against the technology giant. In a letter sent to all European data protection regulators, Thomas le Bonniec said that Apple had conducted a "massive violation of the privacy of millions of citizens." He wrote that although news of the case had already gone public, the technology giant "has not been subject to any kind of investigation to the best of my knowledge." Mr Le Bonniec, who was hired by one of Apple's subcontractors in Ireland called Globe Technical Services, had to listen to recordings from users and correct transcription errors. He listened to hundreds of recordings from Apple's iPhones, iPads, and Apple Watches, many of which were taken "outside of any activation of Siri" – meaning that users were not aware of the action.
Apple's latest iPhone software update, iOS 13.5, released Wednesday, is there for you. Your eyes, nose and mouth must be visible for Face ID, Apple's facial recognition software, to recognize you. But with the coronavirus, device owners may be wearing masks when out in public. So Apple is making it easier for you to unlock your phone when you have a mask on. Install the update and you will no longer have to wait for Face ID to fail several times before being prompted to enter your passcode.
This article is part of Privacy in the Pandemic, a Future Tense series. Since the pandemic began, authorities in New Delhi, Italy, Oman, Connecticut, and China have begun to experiment with fever-finding drones as a means of mass COVID-19 screening. They're claiming the aircraft can be used to better understand the health of the population at large and even to identify potentially sick individuals, who can then be pulled aside for further diagnostic testing. In Italy, police forces are reportedly using drones to read the temperatures of people who are out and about during quarantine, while officials in India are hoping to use thermal-scanner-equipped drones to search for "temperature anomalies" in people on the ground. A Lithuanian drone pilot even used a thermal-scanning drone to read the temperature of a sick friend who didn't own a thermometer.
Tiny drug-carrying robots that can move against the direction of blood flow could one day be used to deliver chemotherapy drugs directly to cancer cells. Metin Sitti at the Max Planck Institute for Intelligent Systems in Stuttgart, Germany, and his colleagues have developed tiny robots called "microrollers" that can carry cancer drugs and selectively target human breast cancer cells. The team drew inspiration for the design of the robots from white blood cells in the human body, which can move along the walls of blood vessels against the direction of blood flow. The microrollers are made from glass microparticles and are spherical in shape. One half of each robot is coated with a thin magnetic nanofilm made from nickel and gold.