IBM and University of Melbourne present seizure prediction system

ZDNet

Researchers from IBM and the University of Melbourne have developed a proof-of-concept seizure forecasting system that predicted an average of 69 percent of seizures across a dataset of 10 epilepsy patients. The system, which the scientists claim is "fully automated, patient-specific, and tunable to an individual's needs", uses a combination of deep-learning algorithms and a low-power "brain-inspired" computing chip to predict when seizures might occur, even for patients with no previously known predictive indicators. IBM noted that a one-size-fits-all approach is inadequate for epilepsy management, as the condition manifests uniquely in each patient. "Epilepsy is a very unique condition where triggers for seizures are specific to individual patients -- some may be sensitive to heat, others to stress. This is why deep learning is important because it can interpret the data and look for signs and patterns specific to an individual's brain signals," an IBM spokesperson told ZDNet.
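The patient-specific pipeline the article describes can be caricatured in a few lines. The sketch below uses synthetic data and a plain logistic-regression classifier as a stand-in for the deep network IBM actually used; the two features (log-variance and "line length") are common in the seizure-detection literature but are assumptions here, not details from the article.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic one-channel EEG windows (256 samples each); "pre-seizure"
# windows are given higher amplitude than baseline. Purely illustrative.
def make_windows(n, scale):
    return rng.normal(0.0, scale, size=(n, 256))

baseline = make_windows(200, 1.0)
preseizure = make_windows(200, 3.0)

# Two simple per-window features: log-variance and mean "line length"
# (sum of absolute sample-to-sample differences).
def features(w):
    logvar = np.log(w.var(axis=1))
    linelen = np.abs(np.diff(w, axis=1)).sum(axis=1) / w.shape[1]
    return np.column_stack([logvar, linelen])

X = np.vstack([features(baseline), features(preseizure)])
y = np.concatenate([np.zeros(200), np.ones(200)])

# Standardize features, then fit logistic regression by gradient descent.
X = (X - X.mean(0)) / X.std(0)
w, b = np.zeros(2), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # predicted seizure risk
    w -= 0.5 * (X.T @ (p - y)) / len(y)
    b -= 0.5 * (p - y).mean()

pred = (1.0 / (1.0 + np.exp(-(X @ w + b)))) > 0.5
acc = (pred == y).mean()
```

A real patient-specific system would retrain (or fine-tune) this kind of model on each individual's recordings, which is the "tunable to an individual's needs" property the article emphasizes.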


An Old Technique Could Put Artificial Intelligence in Your Hearing Aid

WIRED

Dag Spicer is expecting a special package soon, but it's not a Black Friday impulse buy. The fist-sized motor, greened by corrosion, is from a historic room-sized computer intended to ape the human brain. It may also point toward artificial intelligence's future. Spicer is senior curator at the Computer History Museum in Mountain View, California. The motor in the mail is from the Mark 1 Perceptron, built by Cornell researcher Frank Rosenblatt in 1958.
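Rosenblatt's learning rule, which the Mark 1 implemented in motors and potentiometers, still fits in a dozen lines of modern code. The toy dataset (the AND function) and learning rate below are invented for illustration, not a reconstruction of the actual machine.

```python
import numpy as np

# Inputs and targets for logical AND, a linearly separable problem
# a single perceptron can learn.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 0, 0, 1])

w, b = np.zeros(2), 0.0
for _ in range(10):                       # a few passes suffice here
    for xi, yi in zip(X, y):
        pred = 1 if xi @ w + b > 0 else 0
        err = yi - pred                   # Rosenblatt's update rule:
        w += err * xi                     # nudge weights by the error
        b += err

preds = [1 if xi @ w + b > 0 else 0 for xi in X]
```

The same threshold-and-adjust idea, stacked into many layers and trained by gradient descent instead of this simple rule, is the core of today's deep learning.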


How Digital Is Transforming Child Healthcare - CXOtoday.com

#artificialintelligence

Just imagine a day when you no longer have to wait weeks to see your paediatrician, then wait again for your child's test results, and then wait still longer for an accurate child health record. Shifting demographics, a growing child population and a steep rise in the chronic illnesses children suffer from today have created an enormous demand for health and social care services for children. Given the capabilities of 21st-century technology, the major question is how we can modify the child healthcare system to better cope with these rising needs. The answer is to digitalize child healthcare efficiently and open it to innovation. Electronic child healthcare is something that needs immediate attention.


Using Artificial Intelligence to Rapidly Identify Brain Tumors

#artificialintelligence

At the 20th International Conference on Medical Image Computing and Computer-Assisted Intervention (MICCAI 2017), Professor Biros and collaborators presented the results of a new, automated method of characterizing gliomas. The system combines biophysical models of tumor growth with machine learning algorithms to analyze the magnetic resonance (MR) imaging data of glioma patients. The research team put their new system to the test at the Multimodal Brain Tumor Segmentation Challenge 2017 (BRaTS'17), a yearly competition at which research groups present new approaches and results for computer-assisted identification and classification of brain tumors using data from pre-operative MR scans. Each stage in the analysis pipeline used a different TACC computing system; the nearest-neighbor machine learning classification component used 60 nodes at once (each consisting of 68 processors) on TACC's latest supercomputer, Stampede2.
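As a toy illustration of the nearest-neighbor classification stage mentioned above: each query feature vector is simply assigned the label of its closest training example. The two-dimensional features and labels below are invented; the real pipeline applies this idea to MR-derived features at supercomputer scale.

```python
import numpy as np

# Hypothetical training set: 2-D feature vectors with tissue labels.
train_X = np.array([[0.10, 0.20], [0.90, 0.80], [0.15, 0.25], [0.85, 0.90]])
train_y = np.array([0, 1, 0, 1])   # 0 = healthy tissue, 1 = tumor

def nearest_neighbor(x):
    # Euclidean distance to every training vector; copy the label
    # of the closest one.
    d = np.linalg.norm(train_X - x, axis=1)
    return train_y[np.argmin(d)]

labels = [nearest_neighbor(np.array(q)) for q in ([0.0, 0.1], [1.0, 1.0])]
```

Because every query is independent, this step parallelizes trivially, which is why it could be spread across 60 Stampede2 nodes at once.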


Exclusive: Prannoy Roy Speaks To Google CEO Sundar Pichai

#artificialintelligence

In an exclusive interview at the Googleplex in California, Google CEO Sundar Pichai spoke to NDTV about why India is important for Google, the benefits and dangers of Artificial Intelligence, privacy and data security, and Aadhaar. NDTV: You know there is a bit of controversy, the claim that the most dangerous thing, World War 3, will be caused by Artificial Intelligence. What are the dangers of Artificial Intelligence? So what is the probability, or the percentage chance, of World War 3 being caused by Artificial Intelligence? Pichai: Think of rural places in India where you don't have trained ophthalmologists; getting a tool like that into the hands of doctors in the next 10 years means that right there in those villages, AI software will help any doctor diagnose people and maybe detect blindness early. It's completely curable if detected.


Google Brain chief: AI tops humans in computer vision, and healthcare will never be the same - SiliconANGLE

#artificialintelligence

But despite their training, research shows that multiple pathologists agree on some breast cancer diagnoses only about 42 percent of the time. Released in March 2017, the study explains that Google Brain's ongoing AI medical research group trained computers to detect cancer by feeding them hundreds of pathology images, reaching accuracy levels nearing 90 percent. Researchers fed an AI system 150,000 eye images and, in the end, got the AI software to exceed human accuracy levels in diagnosis, the research showed. From drug discovery to overall patient care to mapping the synaptic connections in the human brain, AI appears likely to soon transform healthcare and medical treatment for millions, Dean said.


NVIDIA morphs from graphics and gaming to AI and deep learning

ZDNet

Specifically, the work required to train predictive machine learning models, especially those based on neural networks and so-called deep learning, involves analyzing large volumes of data, looking for patterns and building statistically based heuristics. In the case of GE venture Avitas Systems, this work involves the physical inspection of industrial infrastructure, including flare stacks and heated gas plumes. But drone-based inspection requires real-time intelligent guidance based on readings picked up by the drones' sensors (including temperatures encountered and what the drone "sees"). Here's another one: because Avitas Systems is a GE venture, it uses GE Predix, a predictive analytics platform that integrates with GE Historian.


Two-year-olds should learn to code, says computing pioneer

The Guardian

Dame Stephanie Shirley, whose company was one of the first to sell software in the 1960s, said that engaging very young children – in particular girls – could ignite a passion for puzzles and problem-solving long before the "male geek" stereotype took hold. "I don't think you can start too early," she said, adding that evidence suggested that the best time to introduce children to simple coding activities was between the ages of two and seven years. "Companies run by women still have extraordinary difficulty in getting venture capital," she said. Such technology is already being tested at Priors Court in Berkshire, a residential school for autistic children that Shirley founded.


Science and Technology links (May 18th, 2017)

#artificialintelligence

And, of course, you and I cannot have access to China's Sunway TaihuLight whereas, for the right price, Google gives us access to its computing pods. However, it looks like Google is within striking distance of matching the human brain in raw computing power with a single rack of computers. Even though I am not diabetic (to my knowledge), I would love an Apple watch that monitors my blood glucose. There are countries with low fertility but high longevity (e.g., Japan) and countries with high fertility and short lives (e.g., many countries in Africa).


How Machine Learning In The Database Can Change Industries And Save Lives - ARC

#artificialintelligence

The coming era will be defined by machine learning, deep learning and artificial intelligence, built on top of the mobile/cloud model. Computing has moved from massive mainframes accessed by terminals, to databases and personal computers, to the cloud and mobile devices. As Microsoft has shown, machine learning models can be moved to the edge by bringing artificial intelligence capabilities that could once run only in the cloud down to the device. This is done by building compute into edge devices (CPUs, GPUs and the like, as we have seen with the maturation of IoT) and by bringing cloud computing capabilities to the edge through virtual machines and Docker-style containerization.
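The cloud-to-edge handoff described above can be sketched minimally: train in the cloud, ship a small parameter artifact, and run only cheap inference on the device. Everything here, the linear model, the JSON artifact format, is an illustrative assumption rather than any vendor's actual API.

```python
import json
import numpy as np

# --- "cloud" side: fit a least-squares linear model on training data ---
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([1.0, 3.0, 5.0, 7.0])            # underlying rule: y = 2x + 1
A = np.hstack([X, np.ones((len(X), 1))])      # add intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Export only the learned parameters as a tiny artifact to ship to the device.
artifact = json.dumps({"w": float(coef[0]), "b": float(coef[1])})

# --- "edge" side: load the artifact and predict locally, no cloud needed ---
params = json.loads(artifact)

def predict(x):
    return params["w"] * x + params["b"]
```

The design point is that the expensive part (training) stays in the data center, while the artifact the edge device receives, whether a JSON blob like this or a containerized model image, is small enough to run on modest local hardware.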