New Finding

Google DeepMind's AI can now detect over 50 sight-threatening eye conditions


Tech industry insiders regularly herald AI as the solution to all of our problems, including those posed by health care. London-based DeepMind, owned by Google's parent company, Alphabet, focuses heavily on the specifics of using artificial intelligence in health care, and on Monday it released a study showing the progress it has made in using AI to diagnose eye conditions. Published in the science journal Nature, the study reports that DeepMind, in partnership with Moorfields Eye Hospital in London, has trained its algorithms to detect over 50 sight-threatening conditions as accurately as expert clinicians. In a project that began two years ago, DeepMind trained its machine-learning algorithms on thousands of historic, fully anonymized eye scans to identify diseases that could lead to sight loss. According to the study, the system can now do so with 94 percent accuracy, and the hope is that it could eventually be used to transform how eye exams are conducted around the world.

'Please do not switch me off!' People heed begging robot


Of the 43 people who heard Nao beg to stay online, 13 chose to listen and did not turn him off, according to the study. Some merciful participants said they felt sorry for Nao and his fear of the void. Others reported that they did not want to act against Nao's will. And while the majority of people turned Nao off despite his protests, those people hesitated to do so, waiting on average more than twice as long as people in tests where Nao did not make his plea. The study builds on existing research showing that humans are inclined to treat electronic media as living beings.

Don't get on a bee's bad side! They recognize faces, study finds


Scientists discovered that bees have visual processing systems just like humans, which could help us understand how facial recognition evolved.

'Please do not switch me off!': An experiment with a begging robot shows people hesitate to pull the plug

Washington Post

If a little humanoid robot begged you not to shut it off, would you show compassion? In an experiment designed to investigate how people treat robots when they act like humans, many participants struggled to power down a pleading robot, either refusing to shut it off or taking more than twice the amount of time to pull the plug. The experiment was conducted by researchers in Germany whose findings were published in the scientific journal PLOS One, the Verge reported this month. Eighty-nine volunteers were asked to help improve a robot's interactions by completing two tasks with it: creating a weekly schedule and answering such questions as "Do you rather like pizza or pasta?" The tasks with the robot, named Nao, were actually part of a ploy, however.

AI identifies heat-resistant coral reefs in Indonesia


A recent scientific survey off the coast of Sulawesi Island in Indonesia suggests that some shallow water corals may be less vulnerable to global warming than previously thought. Between 2014 and 2017, the world's reefs endured the worst coral bleaching event in history, as the cyclical El Niño climate event combined with anthropogenic warming to cause unprecedented increases in water temperature. But the June survey, funded by Microsoft co-founder Paul Allen's family foundation, found the Sulawesi reefs were surprisingly healthy. In fact they were in better condition than when they were originally surveyed in 2014 - a surprise for British scientist Dr Emma Kennedy, who led the research team. "After several depressing years as a coral reef scientist, witnessing the worst-ever global coral bleaching event, it is unbelievably encouraging to experience reefs such as these," she said.

Scientists prove we are unable to learn in our sleep

Daily Mail

The ability to learn something new while you sleep, known as hypnopedia, has long been the dream of students cramming for upcoming exams. However, a new study suggests the practice might be impossible. Experts hooked participants up to advanced brain scanners to monitor them while they slept and while they were awake. Scientists then played the participants sounds either at random or as part of three distinct patterns. Sleeping volunteers showed no brain activity associated with detecting the similarities between sounds, while the participants who were awake had no trouble picking out the pattern in the recordings.

Novel optics for ultrafast cameras create new possibilities for imaging

MIT News

MIT researchers have developed novel photography optics that capture images based on the timing of light reflecting inside the optics, instead of the traditional approach that relies on the arrangement of optical components. These new principles, the researchers say, open doors to new capabilities for time- or depth-sensitive cameras, which are not possible with conventional photography optics. Specifically, the researchers designed new optics for an ultrafast sensor called a streak camera that resolves images from ultrashort pulses of light. Streak cameras and other ultrafast cameras have been used to make a trillion-frame-per-second video, scan through closed books, and produce depth maps of 3-D scenes, among other applications. Such cameras have relied on conventional optics, which come with various design constraints.

Small group of students beats Google's machine learning code


A small team of student AI (artificial intelligence) coders has outperformed code from Google's researchers, an important benchmark reveals. Students from fast.ai, a non-profit group that creates learning resources and is dedicated to making deep learning "accessible to all", have created an AI algorithm that beats code from Google's researchers. Researchers from Stanford measured the algorithm using a benchmark called DAWNBench, which uses a common image classification task to track the speed of a deep-learning algorithm per dollar of compute power. According to the benchmark, the researchers found that the algorithm built by fast.ai's team had beaten Google's code. fast.ai consists of part-time students who are eager to try out machine learning and convert it into a career in data science.
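DAWNBench's cost metric boils down to the price of the cloud compute consumed while training a model to a fixed accuracy target. A minimal sketch of that arithmetic is below; the figures are invented for illustration and are not taken from the leaderboard:

```python
def cost_to_accuracy(train_hours: float, hourly_rate_usd: float) -> float:
    """Cloud cost of one training run that reaches the target accuracy.

    Both inputs are hypothetical; real DAWNBench entries report the
    measured training time and the published cloud instance price.
    """
    return train_hours * hourly_rate_usd

# e.g. a 15-minute run on an instance billed at $24/hour
print(cost_to_accuracy(0.25, 24.0))  # -> 6.0
```

A faster algorithm lowers this cost directly, which is how a small team's training tricks can beat a larger lab's code on the benchmark without larger hardware budgets.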

Using your phone while doing other things makes your life more miserable, study finds

The Independent

People who check their phones while eating or spending time with their friends are less likely to enjoy themselves, a study has found. Researchers at the University of British Columbia in Canada found that mobile phone use is making people more distracted, distant and drained as a result of its pervasiveness in our modern lives. Even having a mobile phone within easy access during a meal is enough to make diners not enjoy the experience as much as those who keep their devices out of reach while they eat.

Machine Learning Can Identify the Authors of Anonymous Code


Researchers who study stylometry, the statistical analysis of linguistic style, have long known that writing is a unique, individualistic process. The vocabulary you select, your syntax, and your grammatical decisions leave behind a signature. Automated tools can now accurately identify the author of a forum post, for example, as long as they have adequate training data to work with. But newer research shows that stylometry can also apply to artificial language samples, like code. Software developers, it turns out, leave behind a fingerprint as well.
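As a toy illustration of the general idea (not the researchers' actual method), one might compute a handful of crude style features from a code sample and feed them to a standard classifier trained on samples of known authorship. The feature names below are invented for this sketch; the published work relies on far richer signals, such as properties of the code's abstract syntax tree:

```python
def style_features(source: str) -> dict:
    """Extract a few crude stylistic features from a code sample.

    Purely illustrative: average line length, tab-indentation habits,
    and a preference for underscore_names are the kind of surface
    signals a stylometric classifier might consume.
    """
    lines = [ln for ln in source.splitlines() if ln.strip()]
    tokens = source.split()
    return {
        "avg_line_len": sum(len(ln) for ln in lines) / max(len(lines), 1),
        "tab_indent_ratio": sum(ln.startswith("\t") for ln in lines) / max(len(lines), 1),
        "underscore_name_ratio": sum("_" in t for t in tokens) / max(len(tokens), 1),
    }

sample = "def add_numbers(a, b):\n    return a + b\n"
print(style_features(sample))
```

Each author's feature vector tends to cluster in this space, which is what makes attribution possible once enough labeled samples are available.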