Today the field has progressed to the point where algorithms can recognize photos, speech and emotions, fly a drone or drive a truck, spot early signs of diabetes or cancer, and play chess and poker at a championship level. Now, in a leap that could be futuristic, absurd, or life-changing (nobody can predict which), the vision is of a robot religion that worships an AI godhead. Anthony Levandowski, a pioneer of driverless-car technology and a visionary of AI, gained wide media attention by actually forming an AI church named The Way of the Future. He is searching for adherents, and foresees an AI godhead as not ridiculous but inevitable. As he told an interviewer from Wired magazine, "It's not a god in the sense that it makes lightning or causes hurricanes."
Doctors are developing novel solutions to make sure they come up with the right diagnoses. A flood of new initiatives by researchers, physicians, health-care systems, nonprofits and malpractice insurers is yielding new insights and approaches. These include sophisticated computer programs, some that use artificial intelligence to help analyze and diagnose tough cases, and others that scan records for errors such as missed test results and appointments. Advanced technologies aren't just bringing the processing power of big data and machine learning to bear. They are also allowing more doctors to share their knowledge--including lessons they've learned from their own diagnostic mistakes.
About a week ago, Stanford University researchers posted online a study on the latest dystopian AI: they'd made a machine-learning algorithm that essentially works as gaydar. After being trained on tens of thousands of photographs from a dating site, the algorithm could, for example, guess whether a white man in a photograph was gay with 81 percent accuracy. The researchers say they wanted to protect gay people. "[Our] findings expose a threat to the privacy and safety of gay men and women," wrote Michal Kosinski and Yilun Wang in the paper. They built the bomb so they could alert the public about its dangers.
Stanford's review board approved Kosinski and Wang's study. "The vast, vast, vast majority of what we call 'big data' research does not fall under the purview of federal regulations," says Metcalf. Take a recent example: Last month, researchers affiliated with Stony Brook University and several major internet companies released a free app, a machine-learning algorithm that guesses ethnicity and nationality from a name with about 80 percent accuracy. The group also went through an ethics review at the company that provided the training list of names, although Metcalf says that an evaluation at a private company is the "weakest level of review that they could do."
Drone delivery is finally getting off the ground. And the action is happening in East Africa. Zipline, a pioneering drone startup that began delivering blood packs to Rwanda's remote hospitals in October 2016, today announced a major expansion into Tanzania. In early 2018 the company will begin flying its delivery drones to more than 1,000 health care facilities around Tanzania, bringing urgently needed medicines and supplies to big hospitals and tiny rural clinics alike. Keller Rinaudo, founder and CEO of Zipline, says that "the richest companies in the world" are still trying to figure out how to make instant drone delivery work as a commercial service (as IEEE Spectrum has noted in its coverage of Google's Project Wing and Amazon's Prime Air).
Last month in Rwanda, a young woman started bleeding after giving birth by C-section. Try as they might, her doctors couldn't stop it. They'd already transfused the two units of matching blood that they had on hand. They could have called the national blood bank in the capital of Kigali to request more, but ordering it and sending it 25 miles over mountainous roads to the hospital would take up to four hours. The woman didn't have that kind of time.
Picture this: a patient walks into the emergency department and sits in front of the "triage nurse" -- a computer that uses advanced algorithms to ask questions based on the patient's answers. Researchers at the Massachusetts Institute of Technology (MIT) are testing robotic decision-support systems that schedule nursing tasks and assign rooms to patients. TAVIE uses pre-recorded videos of a nurse to coach patients to manage their health condition and make behaviour changes. Ryan Chan, an emergency nurse and a master's student, is working with Booth and his research team as they develop an online computer game to teach electronic medication administration to nursing students.
But the AI's work isn't done yet. Comparing changes in the genetic code with infection rates and virulence factors could give us a better model for working toward a vaccine for this insufferable virus. And if we finally manage to program an AI that can tell us how it arrives at its conclusions, that would be a powerful collaboration indeed. Imagine an AI that evolves with the virus it tracks.