Yet with attempt after attempt, Brett improves, learning by trial and error how to eventually nail the execution. A programmer could keep tweaking Brett's algorithm to get it to learn ever faster, sure. "And you might have a reinforcement learning algorithm that maybe can have a robot learn to walk in a few hours rather than two weeks, maybe even faster." Without robots learning to learn, humans will have to hold their hands.
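Trial-and-error learning of this kind can be sketched with a toy reinforcement-learning loop. The example below is an illustrative tabular Q-learning agent on a made-up one-dimensional "reach the goal" task — not Brett's actual algorithm, whose details aren't given here — but it shows the same idea: the agent starts knowing nothing, fails repeatedly, and gradually learns which action to take in each state.

```python
import random

# Toy illustration of trial-and-error learning: tabular Q-learning on a
# one-dimensional track (states 0..4). Reaching state 4 earns a reward
# of 1. This is a made-up task for illustration only.
N_STATES = 5
ACTIONS = (-1, +1)                        # step left or right
q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}
alpha, gamma, epsilon = 0.5, 0.9, 0.2     # learning rate, discount, exploration

random.seed(0)
for episode in range(500):
    s = 0
    for step in range(50):                # cap episode length
        if random.random() < epsilon:     # explore occasionally...
            a = random.choice(ACTIONS)
        else:                             # ...otherwise exploit, breaking ties randomly
            best = max(q[(s, b)] for b in ACTIONS)
            a = random.choice([b for b in ACTIONS if q[(s, b)] == best])
        s_next = min(max(s + a, 0), N_STATES - 1)
        reward = 1.0 if s_next == N_STATES - 1 else 0.0
        # Standard Q-learning update.
        target = reward + gamma * max(q[(s_next, b)] for b in ACTIONS)
        q[(s, a)] += alpha * (target - q[(s, a)])
        s = s_next
        if s == N_STATES - 1:
            break

# After training, the greedy policy heads straight for the goal.
policy = [max(ACTIONS, key=lambda a: q[(s, a)]) for s in range(N_STATES - 1)]
print(policy)  # [1, 1, 1, 1]
```

Early episodes wander aimlessly; once the reward is found, the value estimates propagate backward and the agent "nails the execution" — the same shape of improvement, in miniature, that Brett exhibits.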
Someone will have to propose (or at least accept, when an algorithm proposes) an explicit, unambiguous rule for when to pull the lever, push the heavy man, or swerve into the café. Attorneys have even invented an adage to make this abrogation seem responsible: "Hard cases make bad law," it's said. In truth, the lawyers-will-save-us argument has the direction of causality backwards: The impact of the law will not be felt upon the trolley problem; rather, the impact of the trolley problem, and its solution, will be felt upon the law -- for example, in how juries are instructed to determine whether someone behaved reasonably. Hard cases don't make bad law; they make bad jurists, ones who are afraid to admit that their reasoning is often driven by selfishness, sentimentality, or social pressures.
The latest of these comes from Silicon Valley startup Matternet, which has been testing an autonomous drone network over Switzerland, shuttling blood and other medical samples between hospitals and testing facilities. Right now, Raptopoulos says, hospitals move those supplies with third-party couriers that tend to be expensive and unreliable, or even with taxis. To replace those couriers, the California company developed a drone base station that automates ground operations, making life as easy as possible for operators. Another Silicon Valley startup, Zipline, delivers blood and vaccine supplies throughout Rwanda and Tanzania, African countries where lacking infrastructure makes flying far more efficient than driving.
But because I am visiting Nest, and Mittleman, its Head of Product Design, is working on the new gadgets that this startup-turned-controversial Alphabet division is launching, I can't say I am surprised. The new products include the aforementioned Nest Secure, a home security system; Nest Hello, an internet-connected doorbell; an outdoor version of its Nest Cam IQ security camera (which uses Google face recognition to identify people who wander into range); and, perhaps most significant, the integration of the voice-based Google Assistant into Nest products, beginning with the indoor IQ camera. When Fadell left in June 2016, Larry Page replaced him with Marwan Fawaz, a nuts-and-bolts guy in the mold of other recent Alphabet division leaders: experienced, middle-aged guys (always guys) known less for vision than for delivering quarterly results. Though no case like that had ever been reported, Nest halted sales for a few weeks and issued a recall to disable the feature in 440,000 Protect units.
Today, we're on the verge of another revolution, as artificial intelligence and machine learning turn the graphic design field on its head again. These kinds of automated tools will arrive on the web first, but print design will change, too, as design-software makers inject machine learning into their layout tools and apps. Wix, another popular website builder, also offers an AI solution: Wix ADI (Artificial Design Intelligence). These web design tools might offer assistance using artificial intelligence, machine learning, and algorithms, but on the whole, they still require hands-on use.
The brainchild of Ludovic Huraux, a French entrepreneur best known for building a popular French dating site, Attractive World, Shapr is an app that helps you meet new professional connections. Every day Shapr pairs each of its users with 15 new connections, selected by an algorithm that matches shared interests or comparable career levels. Shapr adapted its algorithm to weigh interests (both personal and professional) that users add themselves, along with career level as determined by a combination of human moderators and a machine learning algorithm. Though many networking groups, such as Meetup, have succeeded by siloing users by interest, Huraux believes deeply that connecting with people outside of your normal interest groups is the key to professional success.
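Shapr hasn't published its scoring function, but the kind of matching described above — shared interests plus career level — can be sketched with set overlap and a weighted score. Everything below (the user records, the 0.7/0.3 weights, the 0–10 career scale) is a made-up illustration, not Shapr's actual logic.

```python
def jaccard(a, b):
    """Overlap between two interest sets (0 = disjoint, 1 = identical)."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def daily_matches(user, candidates, k=15):
    """Rank candidates by shared interests plus career-level proximity."""
    def score(c):
        interest = jaccard(user["interests"], c["interests"])
        career = 1.0 - abs(user["level"] - c["level"]) / 10.0  # levels 0-10
        return 0.7 * interest + 0.3 * career                   # made-up weights
    return sorted(candidates, key=score, reverse=True)[:k]

# Hypothetical users for illustration.
alice = {"name": "Alice", "interests": {"ai", "design"}, "level": 5}
pool = [
    {"name": "Bob",   "interests": {"ai", "robotics"}, "level": 6},
    {"name": "Carol", "interests": {"sales"},          "level": 2},
]
print([c["name"] for c in daily_matches(alice, pool, k=1)])  # ['Bob']
```

Note that a pure overlap score would always silo users by interest; Huraux's point about serendipity suggests a real system would also inject some deliberately dissimilar matches.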
For months now, major companies have been hooking up--Uber and Daimler, Lyft and General Motors, Microsoft and Volvo--but Intel CEO Brian Krzanich's announcement on Monday that the giant chipmaker is helping Waymo, Google's self-driving car project, build robocar technology registers as some seriously juicy gossip. Krzanich said Monday that Waymo's newest self-driving Chrysler Pacificas, delivered last December, use Intel technology to process what's going on around them and make safe decisions in real time. And last year, Google announced it had created its own specialized chip that could help AVs recognize common driving situations and react efficiently and safely. "Our self-driving cars require the highest-performance compute to make safe driving decisions in real-time," Waymo CEO John Krafcik said in a statement.
On Monday, Cisco's Talos security research division revealed that hackers sabotaged the ultra-popular, free computer-cleanup tool CCleaner for at least the last month, inserting a backdoor into updates to the application that landed in millions of personal computers. Three times in the last three months, hackers have exploited the digital supply chain to plant tainted code that hides in software companies' own systems of installation and updates, hijacking those trusted channels to stealthily spread their malicious code. According to Avast, the tainted version of the CCleaner app had been installed 2.27 million times from when the software was first sabotaged in August until last week, when a beta version of a Cisco network monitoring tool discovered the rogue app acting suspiciously on a customer's network. One month later, researchers at Russian security firm Kaspersky discovered another supply chain attack they called "Shadowpad": Hackers had smuggled a backdoor capable of downloading malware into hundreds of banks, energy firms, and drug companies via corrupted software distributed by the South Korea-based firm Netsarang, which sells enterprise and network management tools.
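One basic defense against this kind of tampering is for clients to verify a downloaded update against a known-good digest published through a channel the attacker doesn't control, such as a signed manifest. Below is a minimal sketch using Python's standard `hashlib` and `hmac` modules; the payloads are hypothetical. As the CCleaner case shows, this only helps if the expected digest itself comes from outside the compromised build pipeline — an attacker who controls the build server can publish matching hashes for tainted code.

```python
import hashlib
import hmac

def verify_update(payload: bytes, expected_sha256: str) -> bool:
    """Accept an update only if its SHA-256 digest matches the published one."""
    actual = hashlib.sha256(payload).hexdigest()
    # Constant-time comparison avoids leaking how many leading
    # characters of the digest matched.
    return hmac.compare_digest(actual, expected_sha256)

# Hypothetical installer bytes and their published digest.
good = b"installer v5.33"
digest = hashlib.sha256(good).hexdigest()

print(verify_update(good, digest))                            # True
print(verify_update(b"installer v5.33 + backdoor", digest))   # False
```

Real update systems layer code-signing on top of hashing for exactly the reason above: a signature binds the release to a private key that (ideally) never touches the build infrastructure.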
That's why Karray's team created a prototype system that uses cameras--both Microsoft Kinect cameras and simple dashcams, mounted in a variety of locations on a simulated dashboard--to detect hand movements, plus algorithms that grade those movements on how likely they are to put the driver in danger. (Cadillac's Super Cruise system, for example, tracks the human's head position with an infrared camera.) The team created its grading algorithm with end-to-end deep learning, training the computer on a large number of images--hand positions, head placement--drawn from known distracted-driving scenarios. "Unlike pattern recognition-based algorithms, deep neural networks learn from the huge number of samples presented to them to build their capabilities," says Karray, who conducted the research with Waterloo's Arief Koesdwiady, Chaojie Ou, and Safaa Bedawi.
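The contrast Karray draws — learning behavior from labeled examples rather than hand-coding rules — can be shown in miniature. The sketch below trains a single logistic unit, by gradient descent, on two made-up numeric features standing in for what the cameras extract (hand offset from the wheel, seconds of gaze away from the road). The real system trains deep networks directly on camera images; this is only the smallest possible illustration of the same learn-from-samples principle.

```python
import math

# Minimal learn-from-examples illustration: one logistic unit trained on
# made-up (hand_offset, eyes_off_road_seconds) features. Labels:
# 0 = attentive, 1 = distracted. Not the Waterloo team's actual model.
data = [((0.1, 0.0), 0), ((0.2, 0.5), 0),    # attentive examples
        ((0.9, 2.0), 1), ((0.8, 1.5), 1)]    # distracted examples

w, b, lr = [0.0, 0.0], 0.0, 0.5

def predict(x):
    """Probability the driver is distracted, given the two features."""
    z = w[0] * x[0] + w[1] * x[1] + b
    return 1.0 / (1.0 + math.exp(-z))

for _ in range(2000):
    for x, y in data:
        p = predict(x)
        err = p - y                 # gradient of log loss w.r.t. the pre-activation
        w[0] -= lr * err * x[0]
        w[1] -= lr * err * x[1]
        b -= lr * err

print(round(predict((0.15, 0.2))))  # 0 -> graded attentive
print(round(predict((0.85, 1.8))))  # 1 -> graded distracted
```

No rule about hand position was ever written down; the decision boundary comes entirely from the labeled samples, which is the property Karray is pointing at when he contrasts deep networks with hand-built pattern recognizers.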
Stanford's review board approved Kosinski and Wang's study. "The vast, vast, vast majority of what we call 'big data' research does not fall under the purview of federal regulations," says Metcalf. Take a recent example: Last month, researchers affiliated with Stony Brook University and several major internet companies released a free app, a machine learning algorithm that guesses ethnicity and nationality from a name with about 80 percent accuracy. The group also went through an ethics review at the company that provided the training list of names, although Metcalf says that an evaluation at a private company is the "weakest level of review that they could do."