Scientists Develop Machine Learning Algorithms Using EMR Data to Predict Dementia


To train the machine learning algorithms, researchers gathered data on patients from the Indiana Network for Patient Care. The models used information on prescriptions and diagnoses, which are structured fields, as well as medical notes, which are free text, to predict the onset of dementia. Researchers found that the free-text notes were the most valuable for identifying people at risk of developing the disease. The research team, which also included scientists from Georgia State, Albert Einstein College of Medicine and Solid Research Group, recently published its findings on two different machine learning approaches. The paper published in the Journal of the American Geriatrics Society analyzed the results of a natural language processing algorithm, which learns rules by analyzing examples, and the Artificial Intelligence in Medicine article shared the results from a random forest model, which is built using an ensemble of decision trees. Both methods showed similar accuracy at predicting the onset of dementia within one and three years of diagnosis.
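The random-forest approach described above can be illustrated with a minimal sketch. The features and labels below are synthetic stand-ins invented for illustration, not the study's actual EMR variables or results:

```python
# A minimal random-forest classifier sketch (an ensemble of decision trees),
# trained on synthetic stand-in features rather than real EMR data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000
# Hypothetical structured features (e.g., age, medication count, diagnosis count)
X = rng.normal(size=(n, 3))
# Synthetic outcome loosely tied to the first two features, standing in
# for "developed dementia within the prediction window"
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=n) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)
accuracy = model.score(X_test, y_test)  # held-out accuracy
```

Each tree in the ensemble votes on a prediction, and the forest averages those votes, which is what makes the method robust to noisy individual features.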

Researchers Use Advanced AI to Predict Extreme Weather


Talk about the weather, generally considered a neutral topic of conversation, is about to get extremely interesting. January 2020 was the Earth's hottest January in the past 141 years of climate records, according to scientists at NOAA's National Centers for Environmental Information. Globally, extreme weather and climate disasters pose a threat to public health, economic well-being, and geopolitical stability. Economically, the U.S. has incurred $1.75 trillion in losses since 1980 due to 258 weather and climate disasters, according to NOAA figures. Predicting extreme weather is a complex science, and an area where artificial intelligence (AI) and machine learning, specifically the pattern-recognition capabilities of deep learning, can make a difference in forecasting accuracy.

Stargazing with computers: What machine learning can teach us about the cosmos


Gazing up at the night sky in a rural area, you'll probably see the shining moon surrounded by stars. If you're lucky, you might spot the furthest thing visible with the naked eye--the Andromeda galaxy. When the Department of Energy's (DOE) Legacy Survey of Space and Time (LSST) Camera at the National Science Foundation's Vera Rubin Observatory turns on in 2022, it will take photos of 37 billion galaxies and stars over the course of a decade. The output from this huge telescope will swamp researchers with data. In those 10 years, the LSST Camera will take 2,000 photos for each patch of the Southern Sky it covers.

Artificial intelligence finds disease-related genes


When you use social media, the platform commonly suggests people you may want to add as friends. The suggestion is based on you and the other person having contacts in common, which indicates that you may know each other. In a similar manner, scientists are creating maps of biological networks based on how different proteins or genes interact with each other. The researchers behind a new study have used artificial intelligence, AI, to investigate whether it is possible to discover biological networks using deep learning, in which entities known as "artificial neural networks" are trained on experimental data. Since artificial neural networks are excellent at learning to find patterns in enormous amounts of complex data, they are used in applications such as image recognition.
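As a toy illustration of what "training an artificial neural network on data" means, here is a minimal two-layer network in pure NumPy that learns the XOR pattern, which no single linear rule can capture. This is a generic teaching example, not the study's model:

```python
# A tiny two-layer neural network trained by gradient descent on XOR.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)  # XOR labels

# Randomly initialized weights: 2 inputs -> 4 hidden units -> 1 output
W1 = rng.normal(scale=0.5, size=(2, 4)); b1 = np.zeros(4)
W2 = rng.normal(scale=0.5, size=(4, 1)); b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

losses, lr = [], 0.5
for _ in range(5000):
    h = np.tanh(X @ W1 + b1)       # hidden-layer activations
    p = sigmoid(h @ W2 + b2)       # output probabilities
    losses.append(float(np.mean((p - y) ** 2)))
    # Backpropagate the squared-error loss through both layers
    dp = 2 * (p - y) / len(X) * p * (1 - p)
    dW2, db2 = h.T @ dp, dp.sum(axis=0)
    dh = dp @ W2.T * (1 - h ** 2)
    dW1, db1 = X.T @ dh, dh.sum(axis=0)
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1
```

The same training loop, scaled up to many layers and millions of examples, is what lets such networks pick out interaction patterns in biological data.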

A human-machine collaboration to defend against cyberattacks


Being a cybersecurity analyst at a large company today is a bit like looking for a needle in a haystack -- if that haystack were hurtling toward you at fiber optic speed. Every day, employees and customers generate loads of data that establish a normal set of behaviors. An attacker will also generate data while using any number of techniques to infiltrate the system; the goal is to find that "needle" and stop it before it does any damage. The data-heavy nature of that task lends itself well to the number-crunching prowess of machine learning, and an influx of AI-powered systems has indeed flooded the cybersecurity market over the years. But such systems can come with their own problems, namely a never-ending stream of false positives that can make them more of a time suck than a time saver for security analysts.
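The false-positive tradeoff shows up in even the simplest anomaly detector. The sketch below uses a z-score threshold on synthetic event counts; all the numbers and the scenario are illustrative assumptions, not any vendor's actual system:

```python
# A toy anomaly detector: flag events whose z-score exceeds a threshold,
# and compare false-positive counts at a loose vs. a strict threshold.
import numpy as np

rng = np.random.default_rng(42)
# Synthetic "normal" behavior: daily login counts for benign users
normal = rng.normal(loc=100, scale=10, size=10_000)
# A handful of genuinely anomalous counts mixed in
anomalies = rng.normal(loc=200, scale=10, size=10)
events = np.concatenate([normal, anomalies])
labels = np.concatenate([np.zeros(10_000, bool), np.ones(10, bool)])

mu, sigma = normal.mean(), normal.std()

def alerts(threshold):
    """Flag events whose z-score exceeds the threshold."""
    return np.abs(events - mu) / sigma > threshold

loose, strict = alerts(2.0), alerts(5.0)
loose_fp = int((loose & ~labels).sum())    # benign events flagged anyway
strict_fp = int((strict & ~labels).sum())
```

A loose threshold catches every attack but buries analysts in hundreds of benign alerts; tightening the threshold cuts the noise, at the risk of missing subtler intrusions -- which is exactly the tuning problem the article describes.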

Antarctica's Thwaites glacier at risk of collapse that could raise sea levels by two feet

Daily Mail - Science & tech

Antarctica's Thwaites glacier is threatened by warm water reaching well under it from three directions, which could destroy the ice sheet and raise global sea levels by up to two feet. A team of scientists from Oregon State University took advantage of ice-free waters in West Antarctica to look under the glacier, which is about the size of Great Britain. Warm water from the deep ocean is welling up under the glacier from three different directions and mixing under the ice, the researchers discovered. If it collapses, it could take other parts of the ice shelf with it and become the single largest driver of sea-level rise this century, lead researcher Erin Pettit told Nature. The £39 million study involving UK and US scientists was launched after concerns that the increasingly unstable glacier may have already started to collapse.

Guest speaker presents practical view of artificial intelligence


And one such unknown today is artificial intelligence. Is it right to be afraid of AI? Or is this just an irrational fear of the unknown? To make artificial intelligence more understandable to its workforce, the Air Force Research Laboratory Materials and Manufacturing Directorate recently invited Dr. Erick Brethenoux to explain how it all works and how we all can expect to benefit from it in the future. Brethenoux specializes in machine learning, artificial intelligence and applied cognitive computing on the AI team at Gartner Inc., a consulting firm that AFRL's information technology organization uses for help with its mission-critical priorities. To begin his talk, Brethenoux reassured his audience that artificial intelligence doesn't really exist.

Machine learning finds a novel antibiotic able to kill superbugs - STAT


For decades, discovering novel antibiotics meant digging through the same patch of dirt. Biologists spent countless hours screening soil-dwelling microbes for properties known to kill harmful bacteria. But as superbugs resistant to existing antibiotics have spread widely, breakthroughs were becoming as rare as new places to dig. Now, artificial intelligence is giving scientists a reason to dramatically expand their search into databases of molecules that look nothing like existing antibiotics. A study published Thursday in the journal Cell describes how researchers at the Massachusetts Institute of Technology used machine learning to identify a molecule that appears capable of countering some of the world's most formidable pathogens.

New machine learning method could supercharge battery development for electric vehicles


Battery performance can make or break the electric vehicle experience, from driving range to charging time to the lifetime of the car. Now, artificial intelligence has brought dreams like recharging an EV in the time it takes to stop at a gas station closer to reality, and could help improve other aspects of battery technology. For decades, advances in electric vehicle batteries have been limited by a major bottleneck: evaluation times. At every stage of the battery development process, new technologies must be tested for months or even years to determine how long they will last. But now, a team led by Stanford professors Stefano Ermon and William Chueh has developed a machine learning-based method that slashes these testing times by 98 percent.
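The core idea of early prediction -- estimating a cell's lifetime from its first cycles instead of cycling it to failure -- can be sketched in a few lines. The degradation model and numbers below are illustrative assumptions, not the Stanford team's actual method or data:

```python
# Toy early-prediction sketch: estimate each cell's fade rate from its
# first 100 cycles and extrapolate to the 80%-capacity end-of-life point,
# instead of cycling every cell for thousands of cycles.
import numpy as np

rng = np.random.default_rng(1)
n_cells = 50
# Hypothetical per-cell fade rates (fraction of capacity lost per cycle)
rates = rng.uniform(2e-4, 8e-4, size=n_cells)
# "True" cycle life: cycles until capacity falls to 80% of nominal
true_life = 0.2 / rates

cycles = np.arange(100)
pred_life = np.empty(n_cells)
for i, r in enumerate(rates):
    # Noisy capacity measurements over the first 100 cycles only
    capacity = 1.0 - r * cycles + rng.normal(scale=1e-3, size=100)
    slope = np.polyfit(cycles, capacity, 1)[0]   # estimated fade per cycle
    pred_life[i] = -0.2 / slope                  # extrapolated cycles to 80%

rel_err = np.abs(pred_life - true_life) / true_life
```

In this toy setup, 100 cycles of data are enough to predict lifetimes of several hundred to a few thousand cycles, which is the sense in which early prediction can cut evaluation time so dramatically; the real method learns far richer features than a single fitted slope.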

The AI Show: Amazon data science head on the 3 biggest AI mistakes businesses make


Building artificial intelligence into your products, services, and processes can make you smarter, faster, and better able to compete. But building smart systems using machine learning is not like buying an accounting package or an enterprise resource planning system. That's why executives need as much training as engineers when adopting AI, said Larry Pizette, the head of data science at Amazon's Machine Learning Solutions Lab, in the latest edition of The AI Show from VentureBeat. It's also key to understanding the major mistakes companies make when they're kicking off AI projects. "The part that I think gets missed frequently is teaching the business folks, because people always think about the data scientists and the software developers learning about these skills," said Pizette.