
Predict Ads Click - Practice Data Analysis and Logistic Regression Prediction - Projects Based Learning

#artificialintelligence

In this project we will be working with a data set indicating whether or not a particular internet user clicked on an advertisement. We will try to create a model that predicts whether or not a user will click on an ad based on that user's features. Welcome to this project on predicting ad clicks with Apache Spark machine learning on the Databricks Community Edition platform, which lets you run your Spark code free of charge simply by registering with an email address. In this project, we explore Apache Spark and machine learning on the Databricks platform. I am a firm believer that the best way to learn is by doing.
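
Since the workflow described (load the data, assemble features, fit a logistic regression, evaluate) maps directly onto Spark MLlib, here is a minimal PySpark sketch of that pipeline. The file path and column names ("Age", "Daily Time Spent on Site", "Daily Internet Usage", "Area Income", "Clicked on Ad") are illustrative assumptions, not the project's actual schema.

```python
# Minimal sketch of a PySpark logistic-regression pipeline for ad-click prediction.
# Path and column names are assumptions, not the project's actual dataset schema.
from pyspark.sql import SparkSession
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.classification import LogisticRegression
from pyspark.ml.evaluation import BinaryClassificationEvaluator

spark = SparkSession.builder.appName("ad-click-prediction").getOrCreate()
df = spark.read.csv("/path/to/advertising.csv", header=True, inferSchema=True)

# Assemble numeric user features into a single vector column.
assembler = VectorAssembler(
    inputCols=["Age", "Daily Time Spent on Site", "Daily Internet Usage", "Area Income"],
    outputCol="features",
)
data = assembler.transform(df).select("features", "Clicked on Ad")

train, test = data.randomSplit([0.7, 0.3], seed=42)
model = LogisticRegression(labelCol="Clicked on Ad").fit(train)

# Evaluate with area under the ROC curve on the held-out split.
auc = BinaryClassificationEvaluator(labelCol="Clicked on Ad").evaluate(model.transform(test))
print(f"Test AUC: {auc:.3f}")
```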


Rapid adaptation of deep learning teaches drones to survive any weather

#artificialintelligence

To be truly useful, drones--that is, autonomous flying vehicles--will need to learn to navigate real-world weather and wind conditions. Right now, drones are either flown under controlled conditions, with no wind, or are operated by humans using remote controls. Drones have been taught to fly in formation in the open skies, but those flights are usually conducted under ideal conditions and circumstances. However, for drones to autonomously perform necessary but quotidian tasks, such as delivering packages or airlifting injured drivers from a traffic accident, drones must be able to adapt to wind conditions in real time--rolling with the punches, meteorologically speaking. To face this challenge, a team of engineers from Caltech has developed Neural-Fly, a deep-learning method that can help drones cope with new and unknown wind conditions in real time just by updating a few key parameters.
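
The key idea described above, adapting only a few parameters on top of a pre-trained representation, can be sketched roughly as an online linear update. The basis function, dimensions, and learning rule below are illustrative placeholders, not Neural-Fly's actual adaptation law.

```python
# Toy sketch: adapt a handful of linear coefficients on top of a fixed, pre-trained
# basis. All quantities here are placeholders standing in for the real flight data.
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((12, 8))   # stands in for a pre-trained network's weights

def wind_basis(state):
    """Placeholder for the learned representation of wind effects."""
    return np.tanh(state @ W)

a = np.zeros(8)      # the "few key parameters" adapted in flight
lr = 0.1             # adaptation gain (illustrative)

for _ in range(1000):
    state = rng.standard_normal(12)          # placeholder flight state
    residual_force = rng.standard_normal()   # placeholder measured aerodynamic residual
    phi = wind_basis(state)
    a += lr * (residual_force - phi @ a) * phi   # gradient-style online correction
```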


Deep reinforcement learning for self-tuning laser source of dissipative solitons - Scientific Reports

#artificialintelligence

Increasing complexity of modern laser systems, mostly originating from the nonlinear dynamics of radiation, makes control of their operation more and more challenging, calling for the development of new approaches in laser engineering. Machine learning methods, providing proven tools for the identification, control, and data analytics of various complex systems, have recently been applied to mode-locked fiber lasers with a special focus on three key areas: self-starting, system optimization and characterization. However, developing machine learning algorithms for a particular laser system, while an interesting research problem, is a demanding task requiring arduous effort and the tuning of a large number of hyper-parameters in the laboratory arrangement. It is not obvious that this learning can be smoothly transferred to systems that differ from the specific laser used for algorithm development, whether by design or by varying environmental parameters. Here we demonstrate that a deep reinforcement learning (DRL) approach, based on trial and error and sequential decisions, can be successfully used to control the generation of dissipative solitons in a mode-locked fiber laser system. We show that a deep Q-learning algorithm can generalize knowledge about the laser system in order to find conditions for stable pulse generation. The region of stable generation was transformed by changing the pumping power of the laser cavity, while a tunable spectral filter was used as the control tool. The deep Q-learning algorithm learns a trajectory of spectral-filter adjustments toward a stable pulsed regime, relying on the state of the output radiation. Our results confirm the potential of deep reinforcement learning to control a nonlinear laser system with feedback. We also demonstrate that mode-locked fiber laser systems, which generate data at high speed, present fruitful photonic test-beds for various machine learning concepts based on large datasets.
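
As a rough illustration of the control loop described above, the skeleton below runs a deep Q-learning update in which discrete actions stand for spectral-filter adjustments and the reward stands for a pulse-stability score. The environment interface, state and action sizes, network, and reward are all assumptions, not the authors' actual setup.

```python
# Skeleton of a deep Q-learning loop for tuning a spectral filter toward a
# stable pulsed regime. The laser interface is a random placeholder.
import random
import numpy as np
import torch
import torch.nn as nn

STATE_DIM, N_ACTIONS = 16, 4   # e.g. spectrum features; discrete filter moves (assumed)

q_net = nn.Sequential(nn.Linear(STATE_DIM, 64), nn.ReLU(), nn.Linear(64, N_ACTIONS))
optimizer = torch.optim.Adam(q_net.parameters(), lr=1e-3)
gamma, epsilon = 0.99, 0.1

def laser_step(state, action):
    """Placeholder for applying a filter adjustment and measuring the output."""
    next_state = np.random.randn(STATE_DIM).astype(np.float32)
    reward = float(np.random.rand())   # stand-in for a pulse-stability score
    return next_state, reward

state = np.random.randn(STATE_DIM).astype(np.float32)
for step in range(1000):
    # Epsilon-greedy selection over discrete filter adjustments.
    if random.random() < epsilon:
        action = random.randrange(N_ACTIONS)
    else:
        action = int(q_net(torch.from_numpy(state)).argmax())
    next_state, reward = laser_step(state, action)

    # One-step temporal-difference update of the Q-network.
    q_pred = q_net(torch.from_numpy(state))[action]
    with torch.no_grad():
        q_target = reward + gamma * q_net(torch.from_numpy(next_state)).max()
    loss = (q_pred - q_target) ** 2
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    state = next_state
```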


'Nanomagnetic' computing can provide low-energy AI

#artificialintelligence

The new method, developed by a team led by Imperial College London researchers, could slash the energy cost of artificial intelligence (AI), which is currently doubling globally every 3.5 months. In a paper published today in Nature Nanotechnology, the international team have produced the first proof that networks of nanomagnets can be used to perform AI-like processing. The researchers showed nanomagnets can be used for 'time-series prediction' tasks, such as predicting and regulating insulin levels in diabetic patients. Artificial intelligence that uses 'neural networks' aims to replicate the way parts of the brain work, where neurons talk to each other to process and retain information. A lot of the maths used to power neural networks was originally invented by physicists to describe the way magnets interact, but at the time it was too difficult to use magnets directly as researchers didn't know how to put data in and get information out.
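
Physical substrates like this are usually framed as reservoir computing, where the substrate transforms the input signal and only a simple linear readout is trained. The sketch below uses a random recurrent map as a software stand-in for the nanomagnet array; the task, sizes, and training details are illustrative only.

```python
# Reservoir-computing sketch: only the linear readout is trained, while the
# "reservoir" (a random recurrent map standing in for the nanomagnet array)
# transforms the input time series. Task and sizes are illustrative.
import numpy as np

rng = np.random.default_rng(1)
N_RES, T = 100, 2000

u = np.sin(np.linspace(0, 60, T + 1))     # toy input signal
target = u[1:]                             # predict the next value
W_in = rng.standard_normal(N_RES) * 0.5
W_res = rng.standard_normal((N_RES, N_RES)) * (0.9 / np.sqrt(N_RES))

# Drive the reservoir and collect its states.
x = np.zeros(N_RES)
states = np.empty((T, N_RES))
for t in range(T):
    x = np.tanh(W_res @ x + W_in * u[t])
    states[t] = x

# Train only the linear readout (ridge regression).
ridge = 1e-6
W_out = np.linalg.solve(states.T @ states + ridge * np.eye(N_RES), states.T @ target)
pred = states @ W_out
print("readout MSE:", np.mean((pred - target) ** 2))
```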


Using machine-learning to distinguish antibody targets

AIHub

The virus's spike proteins (purple) are a key antibody target, with some antibodies attaching to the top (darker purple) and others to the stem (paler zone). A new study shows that it is possible to use the genetic sequences of a person's antibodies to predict what pathogens those antibodies will target. "Our research is in a very early stage, but this proof-of-concept study shows that we can use machine learning to connect the sequence of an antibody to its function," said Nicholas Wu, a professor of biochemistry at the University of Illinois Urbana-Champaign, who led the research with biochemistry PhD student Yiquan Wang and Meng Yuan, a staff scientist at Scripps Research in La Jolla, California. With enough data, scientists should be able to predict not only the virus an antibody will attack, but which features on the pathogen the antibody binds to, Wu said. For example, an antibody may attach to different parts of the spike protein on the SARS-CoV-2 virus.
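
As a toy illustration of connecting antibody sequence to predicted target, the sketch below uses simple k-mer features and a linear classifier; the sequences, labels, and feature choice are placeholders and do not reflect the study's actual model.

```python
# Illustrative sketch: map antibody sequences to predicted binding targets using
# 3-mer "bag of words" features and logistic regression. All data are placeholders.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

sequences = ["QVQLVQSGAEVKKPG", "EVQLVESGGGLVQPG"]   # placeholder antibody fragments
targets = ["spike_top", "spike_stem"]                 # placeholder binding labels

model = make_pipeline(
    CountVectorizer(analyzer="char", ngram_range=(3, 3)),  # character 3-mers
    LogisticRegression(max_iter=1000),
)
model.fit(sequences, targets)
print(model.predict(["QVQLVQSGAEVKKPG"]))
```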


How A.I. Is Finding New Cures in Old Drugs

#artificialintelligence

In the elegant quiet of the café at the Church of Sweden, a narrow Gothic-style building in Midtown Manhattan, Daniel Cohen is taking a break from explaining genetics. He moves toward the creaky piano positioned near the front door, sits down, and plays a flowing, flawless rendition of "Over the Rainbow." If human biology is the scientific equivalent of a complicated score, Cohen has learned how to navigate it like a virtuoso. Cohen was the driving force behind Généthon, the French laboratory that in December 1993 produced the first-ever "map" of the human genome. He essentially introduced Big Data and automation to the study of genomics, as he and his team demonstrated for the first time that it was possible to use super-fast computing to speed up the processing of DNA samples.


AI Technology Can Predict Life-Threatening Heart Trouble, Researchers Say

#artificialintelligence

Researchers at Johns Hopkins University developed artificial intelligence technology that may be able to assess a patient's risk of sudden cardiac death, which is when the heart abruptly stops beating. Sometimes, modern medicine isn't enough to help keep us healthy. The Johns Hopkins University researchers said artificial intelligence can help accurately predict if and when someone's heart will stop beating years in advance. "It uses deep learning on images in combination with deep learning also on clinical data to predict the patient's risk of sudden cardiac death over a period of 10 years," said Dr. Natalia Trayanova, a professor of biomedical engineering and medicine. Trayanova's team developed the AI technology and published their work in a medical journal.
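
A hedged sketch of the general architecture described (one branch for images, one for clinical variables, fused into a single risk score) might look like the following; the layer sizes, inputs, and output are illustrative assumptions, not the Johns Hopkins model.

```python
# Hedged sketch of fusing an image branch with a clinical-data branch to produce
# a single risk score. Architecture and dimensions are invented for illustration.
import torch
import torch.nn as nn

class RiskModel(nn.Module):
    def __init__(self, n_clinical: int = 20):
        super().__init__()
        # Small CNN branch for cardiac images (placeholder architecture).
        self.image_branch = nn.Sequential(
            nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(4),
            nn.Flatten(), nn.Linear(8 * 4 * 4, 32), nn.ReLU(),
        )
        # MLP branch for tabular clinical covariates.
        self.clinical_branch = nn.Sequential(nn.Linear(n_clinical, 32), nn.ReLU())
        # Fused head outputs a single risk score.
        self.head = nn.Linear(64, 1)

    def forward(self, image, clinical):
        fused = torch.cat([self.image_branch(image), self.clinical_branch(clinical)], dim=1)
        return torch.sigmoid(self.head(fused))

model = RiskModel()
risk = model(torch.randn(2, 1, 64, 64), torch.randn(2, 20))  # dummy batch of two patients
print(risk.shape)  # torch.Size([2, 1])
```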


AI helps scientists design novel plastic-eating enzyme

#artificialintelligence

In brief A synthetic enzyme designed using machine-learning software can break down waste plastics in 24 hours, according to research published in Nature. Scientists at the University of Texas at Austin studied the natural structure of PETase, an enzyme known to degrade the polymer chains in polyethylene terephthalate (PET). Next, they trained a model to generate mutations of the enzyme that work fast at low temperatures, let the software loose, and picked from the output a variant they named FAST-PETase to synthesize. FAST stands for functional, active, stable, and tolerant. FAST-PETase, we're told, can break down plastic in as little as 24 hours at temperatures between 30 and 50 degrees Celsius.
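
The ML-guided loop described here (propose mutations, score them with a learned predictor, pick candidates to synthesize) can be caricatured in a few lines; the sequence fragment and the scoring function below are placeholders, not the study's structure-based model.

```python
# Toy sketch of ML-guided enzyme engineering: enumerate candidate point mutations,
# score each with a predictor, and rank them. The sequence and scorer are placeholders.
import random

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
wild_type = "MNFPRASRLMQAAVLGGLMAVSA"   # placeholder fragment, not the real PETase sequence

def predicted_low_temp_activity(sequence: str) -> float:
    """Placeholder for a model predicting activity at roughly 30-50 degrees Celsius."""
    return random.random()

candidates = []
for pos, wt_aa in enumerate(wild_type):
    for aa in AMINO_ACIDS:
        if aa != wt_aa:
            mutant = wild_type[:pos] + aa + wild_type[pos + 1:]
            candidates.append((f"{wt_aa}{pos + 1}{aa}", predicted_low_temp_activity(mutant)))

# Rank mutations by predicted activity and inspect the top few for synthesis.
top = sorted(candidates, key=lambda c: c[1], reverse=True)[:5]
print(top)
```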


Engineers use artificial intelligence to capture the complexity of breaking waves

#artificialintelligence

Waves break once they swell to a critical height, before cresting and crashing into a spray of droplets and bubbles. These waves can be as large as a surfer's point break and as small as a gentle ripple rolling to shore. For decades, the dynamics of how and when a wave breaks have been too complex to predict. Now, MIT engineers have found a new way to model how waves break. The team used machine learning along with data from wave-tank experiments to tweak equations that have traditionally been used to predict wave behavior.
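
In spirit, "tweaking" an existing equation with tank data amounts to fitting a learned correction so the traditional model matches measurements. The toy example below fits a single multiplicative correction by least squares; the model form, coefficient, and data are invented for illustration.

```python
# Minimal illustration of using data to adjust a coefficient in an existing model:
# fit a multiplicative correction so predictions match (synthetic) tank measurements.
import numpy as np

rng = np.random.default_rng(2)

def baseline_model(steepness, coeff=0.88):
    """Toy stand-in for a traditional breaking-wave criterion."""
    return coeff * steepness

steepness = rng.uniform(0.1, 0.5, size=200)                      # placeholder tank inputs
measured = 0.95 * steepness + 0.02 * rng.standard_normal(200)    # placeholder measurements

# Learn the correction factor by least squares against the baseline predictions.
pred = baseline_model(steepness)
correction = np.sum(pred * measured) / np.sum(pred ** 2)
print(f"learned correction factor: {correction:.3f}")
```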


Machine learning-aided engineering of hydrolases for PET depolymerization - Nature

#artificialintelligence

Plastic waste poses an ecological challenge [1-3] and enzymatic degradation offers one potentially green and scalable route for polyester waste recycling [4]. Poly(ethylene terephthalate) (PET) accounts for 12% of global solid waste [5], and a circular carbon economy for PET is theoretically attainable through rapid enzymatic depolymerization followed by repolymerization or conversion/valorization into other products [6-10]. Application of PET hydrolases, however, has been hampered by their lack of robustness to pH and temperature ranges, slow reaction rates and inability to directly use untreated postconsumer plastics [11]. Here, we use a structure-based, machine learning algorithm to engineer a robust and active PET hydrolase. Our mutant and scaffold combination (FAST-PETase: functional, active, stable and tolerant PETase) contains five mutations compared to wild-type PETase (N233K/R224Q/S121E from prediction and D186H/R280A from scaffold) and shows superior PET-hydrolytic activity relative to both wild-type and engineered alternatives [12] between 30 and 50 °C and across a range of pH levels. We demonstrate that untreated, postconsumer PET from 51 different thermoformed products can all be almost completely degraded by FAST-PETase within 1 week. FAST-PETase can also depolymerize untreated, amorphous portions of a commercial water bottle and an entire thermally pretreated water bottle at 50 °C. Finally, we demonstrate a closed-loop PET recycling process by using FAST-PETase and resynthesizing PET from the recovered monomers. Collectively, our results demonstrate a viable route for enzymatic plastic recycling at the industrial scale.