In medicine, false positives are expensive, scary, and even painful. Yes, the doctor eventually tells you that the follow-up biopsy after that blip on the mammogram puts you in the clear. But the intervening weeks are excruciating. A false negative is no better: "Go home, you're fine, those headaches are nothing to worry about." The problem with avoiding both false positives and false negatives, though, is that the more you do to get away from one, the closer you get to the other.
Everyone working in the autonomous vehicle space said it was inevitable. In America--and in the rest of the world--cars kill people, around 40,000 in the US and 1.25 million worldwide each year. Self-driving cars would be better. But no one promised perfection. Still, the death of Elaine Herzberg, struck by a self-driving Uber in Tempe, Arizona, two weeks ago, felt like a shock.
In 1899, the world's most powerful nations signed a treaty at The Hague that banned military use of aircraft, fearing the emerging technology's destructive power. Five years later the moratorium was allowed to expire, and before long aircraft were helping to enable the slaughter of World War I. "Some technologies are so powerful as to be irresistible," says Greg Allen, a fellow at the Center for a New American Security, a nonpartisan Washington, DC, think tank. "Militaries around the world have essentially come to the same conclusion with respect to artificial intelligence." Allen is coauthor of a new 132-page report on the effect of artificial intelligence on national security. One of its conclusions is that the impact of technologies such as autonomous robots on war and international relations could rival that of nuclear weapons.
As her fellow patients read dog-eared magazines or swipe through Instagram, Shari Forrest opens an app on her phone and gets busy training artificial intelligence. She writes textbooks for a living. But when the 54-year-old from suburban St. Louis needs a break or has a free moment, she logs on to Mighty AI and whiles away her time identifying pedestrians and trash cans and other things you don't want driverless cars running into. "If I am sitting waiting for a doctor's appointment and I can make a few pennies, that's not a bad deal," she says. The work is a pleasant distraction for Forrest, but absolutely essential to the coming age of driverless cars.
Ask not what the government can do for Silicon Valley; ask what Silicon Valley can do for the government. The president presented WIRED with six challenges he feels the tech industry needs to address--just a few earthshaking problems the country could use some help with, that's all. We reached out to six of the biggest names in the WIRED world, and we gave each of them a challenge from the president's list. Then we asked: To get this done, what's the industry's best play? Silicon Valley runs on stories. So does the economy in general. We create what we believe in.
The way you drive is surprisingly unique. And in an era when automobiles have become data-harvesting, multi-ton mobile computers, the data collected by your car--or one you rent or borrow--can probably identify you based on that driving style after as little as a few minutes behind the wheel. In a study they plan to present at the Privacy Enhancing Technologies Symposium in Germany this July, a group of researchers from the University of Washington and the University of California at San Diego found that they could "fingerprint" drivers based only on data they collected from the internal computer network of the vehicle their test subjects were driving, what's known as a car's CAN bus. In fact, they found that the data collected from a car's brake pedal alone could let them correctly distinguish their target driver out of 15 individuals about nine times out of ten, after just 15 minutes of driving. With 90 minutes of driving data, or by monitoring more car components, they could pick out the correct driver fully 100 percent of the time.