AI-Alerts


New deep learning models: Fewer neurons, more intelligence

#artificialintelligence

An international research team from TU Wien (Vienna), IST Austria and MIT (USA) has developed a new artificial intelligence system based on the brains of tiny animals, such as threadworms. This novel AI system can control a vehicle with just a few artificial neurons. The team says the system has decisive advantages over previous deep learning models: it copes much better with noisy input, and, because of its simplicity, its mode of operation can be explained in detail. It need not be regarded as a complex "black box"; it can be understood by humans. The new deep learning model has been published in the journal Nature Machine Intelligence.
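
The appeal of the result is that the controller is small enough to inspect neuron by neuron. As a purely illustrative sketch (not the authors' published model), the snippet below Euler-integrates a handful of leaky continuous-time neurons and reads a steering command off their state; in the published "liquid" networks the time constants themselves depend on the input, but here they are fixed, and every size and weight is a made-up stand-in.

```python
# Minimal sketch, assuming a simplified leaky-integrator variant of the
# "few neurons" idea: continuous-time neuron states, Euler-integrated,
# mapping a toy sensor stream to a steering command.
import numpy as np

rng = np.random.default_rng(0)

N_NEURONS = 8    # "just a few" neurons, per the article
N_INPUTS = 4     # hypothetical perception features (e.g. lane offsets)
DT = 0.01        # Euler integration step, in seconds

W_in = rng.normal(size=(N_NEURONS, N_INPUTS))          # input weights
W_rec = rng.normal(size=(N_NEURONS, N_NEURONS)) * 0.1  # recurrent weights
tau = np.abs(rng.normal(1.0, 0.2, size=N_NEURONS))     # time constants
w_out = rng.normal(size=N_NEURONS)                     # steering readout

def step(x, u):
    """One Euler step of dx/dt = -x/tau + tanh(W_in @ u + W_rec @ x)."""
    dxdt = -x / tau + np.tanh(W_in @ u + W_rec @ x)
    return x + DT * dxdt

x = np.zeros(N_NEURONS)
for t in range(1000):
    u = np.sin(0.01 * t + np.arange(N_INPUTS))  # toy sensor stream
    x = step(x, u)

# With so few state variables, each neuron's trajectory can be plotted
# and inspected directly, which is the interpretability point above.
print(f"steering command: {np.tanh(w_out @ x):+.3f}")
```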


Tesla Autopilot Self-Driving Beta Test Will Start Next Week, Elon Musk Confirms

International Business Times

Tesla CEO Elon Musk announced via Twitter on Monday that the company's Autopilot self-driving mode would be made available in a small beta test starting next week. The closed beta-test system will be limited to a small pool of "expert and careful drivers," Car and Driver reported. The Full Self-Driving (FSD) feature has undergone a complete rewrite and is expected to carry a lot of new functionality. The rewrite also updated Autopilot's labeling software to enable it to interpret the environment in 4D instead of 2D. Based on Musk's recent descriptions, the updated software will build on the current "traffic light and stop sign control" feature, likely adding turns at intersections and integrating them fully into Autopilot.


Robot that can perform colonoscopies aims to make them less unpleasant

New Scientist - News

A robot that can perform colonoscopies may make the procedure simpler and less unpleasant. Pietro Valdastri at the University of Leeds in the UK and his colleagues have developed a robotic arm that uses a machine learning algorithm to move a flexible probe along the colon. The probe is a magnetic endoscope, a tube with a camera lens at the tip, which the robot controls via a magnet held outside the body. The system can either work autonomously or be controlled by a human operator using a joystick to push the endoscope tip further along the colon. The system also keeps track of the location and orientation of the endoscope inside the colon.
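
The summary implies a classic sense-decide-act loop: localise the endoscope tip, pick a target (from an autonomous planner or the operator's joystick), and move the external magnet accordingly. The sketch below shows only that loop structure; every interface, name, and gain is hypothetical, not the Leeds group's software.

```python
# Hypothetical control-loop skeleton for a magnetically steered endoscope.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Pose:
    x: float          # tip position along the colon (metres)
    y: float
    z: float
    heading: float    # tip orientation (radians)

def localise_tip(sensor_reading: dict) -> Pose:
    """Stand-in for magnetic localisation of the endoscope tip."""
    return Pose(**sensor_reading)

def autonomous_target(pose: Pose, waypoints: list) -> Pose:
    """Pick the next waypoint along a precomputed colon centreline."""
    remaining = [w for w in waypoints if w.x > pose.x]
    return remaining[0] if remaining else waypoints[-1]

def magnet_command(pose: Pose, target: Pose, gain: float = 0.5):
    """Proportional motion command (dx, dy, dz) for the external magnet."""
    return (gain * (target.x - pose.x),
            gain * (target.y - pose.y),
            gain * (target.z - pose.z))

# One iteration of the loop. A joystick pose, when present, overrides the
# autonomous waypoint, mirroring the dual-mode operation described above.
reading = {"x": 0.10, "y": 0.02, "z": 0.00, "heading": 0.1}
waypoints = [Pose(0.15, 0.02, 0.0, 0.0), Pose(0.30, 0.05, 0.0, 0.2)]
joystick_target: Optional[Pose] = None   # set by the human operator, if any

pose = localise_tip(reading)
target = joystick_target or autonomous_target(pose, waypoints)
print("magnet command:", magnet_command(pose, target))
```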


Earphone cameras watch your facial expressions and read your lips

New Scientist - News

A wearable device consisting of two mini-cameras mounted on earphones can recognise your facial expressions and read your lips, even if your mouth is covered. The tool – called C-Face – was developed by Cheng Zhang at Cornell University in Ithaca, New York, and his colleagues. It looks at the sides of the wearer's head and uses machine learning to accurately visualise facial expressions by analysing small changes in cheek contour lines. "With previous technology to reconstruct facial expression, you had to put a camera in front of you. But that brings a lot of limitations," says Zhang. "Right now, many people are wearing a face mask, and standard facial tracking will not work. Our technology still works because it doesn't rely on what your face looks like."
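
Stripped of the deep learning machinery, C-Face's task is a regression from cheek-contour measurements to frontal facial landmarks. The toy below fits that mapping with plain least squares on synthetic data, purely to make the input/output shapes concrete; the feature and landmark counts are illustrative guesses, not figures from the Cornell paper.

```python
# Toy sketch of the learning problem (not Cornell's model): regress frontal
# facial-landmark positions from cheek-contour features captured by the two
# ear-mounted side cameras. A linear fit stands in for the deep network.
import numpy as np

rng = np.random.default_rng(1)

N_SAMPLES = 500
N_FEATURES = 40     # hypothetical contour measurements from both cameras
N_LANDMARKS = 42    # illustrative count of frontal landmarks (x, y each)

# Synthetic stand-in data: a hidden linear map plus noise.
true_W = rng.normal(size=(N_FEATURES, 2 * N_LANDMARKS))
X = rng.normal(size=(N_SAMPLES, N_FEATURES))
Y = X @ true_W + 0.05 * rng.normal(size=(N_SAMPLES, 2 * N_LANDMARKS))

# Fit the contour -> landmark mapping by least squares.
W, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Reconstruct landmarks for a new frame of cheek-contour features. Because
# the input never includes the mouth region, a mask does not break it.
frame = rng.normal(size=N_FEATURES)
landmarks = (frame @ W).reshape(N_LANDMARKS, 2)
print("reconstructed landmark 0 (x, y):", landmarks[0])
```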


NXP launches AI Ethics initiative

#artificialintelligence

With secure, power-efficient edge computing and AI, everyday devices not only sense their environments but also interpret, analyze, and act in real time on the data they collect. In a new whitepaper entitled The Morals of Algorithms, the company details its comprehensive framework of AI principles: non-maleficence, human autonomy, explicability, continued attention and vigilance, and privacy and security by design. These principles are rooted in NXP's corporate values, ethical guidelines, and long tradition of building some of the world's most sophisticated secure devices. The framework evolved out of a cross-company collaboration, drawing on input and insight from engineering and customer-facing teams around the world. NXP is a vanguard of the AI revolution, with a portfolio of microcontrollers (MCUs) and processors optimized for machine learning "at the edge" of networks, in devices such as thermostats, security systems, car sensors, robots, and industrial automation systems, making them not only intelligent but also faster, more flexible, and more secure.


Amazon's Latest Gimmicks Are Pushing the Limits of Privacy

WIRED

At the end of September, amidst its usual flurry of fall hardware announcements, Amazon debuted two especially futuristic products within five days of each other. The first is a small autonomous surveillance drone, Ring Always Home Cam, that waits patiently inside a charging dock to eventually rise up and fly around your house, checking whether you left the stove on or investigating potential burglaries. The second is a palm recognition scanner, Amazon One, that the company is piloting at two of its grocery stores in Seattle as a mechanism for faster entry and checkout. Both products aim to make security and authentication more convenient, but for privacy-conscious consumers, they also raise red flags. Amazon's latest data-hungry innovations are not launching in a vacuum.


Deep learning enables identification and optimization of RNA-based tools for myriad applications

#artificialintelligence

DNA and RNA have been compared to "instruction manuals" containing the information needed for living "machines" to operate. But while electronic machines like computers and robots are designed from the ground up to serve a specific purpose, biological organisms are governed by a much messier, more complex set of functions that lack the predictability of binary code. Inventing new solutions to biological problems requires teasing apart seemingly intractable variables, a task that is daunting to even the most intrepid human brains. Two teams of scientists from the Wyss Institute at Harvard University and the Massachusetts Institute of Technology have devised pathways around this roadblock by going beyond human brains; they developed a set of machine learning algorithms that can analyze reams of RNA-based "toehold" sequences and predict which ones will be most effective at sensing and responding to a desired target sequence. As reported in two papers published concurrently today in Nature Communications, the algorithms could be generalizable to other problems in synthetic biology as well, and could accelerate the development of biotechnology tools to improve science and medicine and help save lives.
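
At its core, the prediction task is supervised learning over sequences: encode each toehold switch numerically and fit a model that scores how well it will switch on. The sketch below does this with one-hot encoding and a tiny logistic-regression classifier on synthetic data; the sequence length, the labelling rule, and everything else here are invented stand-ins for the teams' deep models and their measured datasets.

```python
# Illustrative sketch of the prediction task (not the Wyss/MIT models):
# one-hot-encode each toehold-switch sequence and train a small classifier
# to score whether the switch will turn on for its target.
import numpy as np

rng = np.random.default_rng(2)
BASES = "ACGU"
SEQ_LEN = 59     # illustrative switch-region length

def one_hot(seq: str) -> np.ndarray:
    """Flattened 4 x L one-hot encoding of an RNA sequence."""
    enc = np.zeros((len(BASES), len(seq)))
    for i, base in enumerate(seq):
        enc[BASES.index(base), i] = 1.0
    return enc.ravel()

# Synthetic dataset: random sequences, with a toy rule standing in for
# the real measured ON/OFF switching ratios.
seqs = ["".join(rng.choice(list(BASES), SEQ_LEN)) for _ in range(2000)]
X = np.stack([one_hot(s) for s in seqs])
y = np.array([1.0 if s[:12].count("A") + s[:12].count("U") > 6 else 0.0
              for s in seqs])

# Tiny logistic-regression "predictor" trained by full-batch gradient descent.
w = np.zeros(X.shape[1])
b = 0.0
for _ in range(300):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    grad = p - y
    w -= 0.1 * (X.T @ grad) / len(y)
    b -= 0.1 * grad.mean()

acc = (((1.0 / (1.0 + np.exp(-(X @ w + b)))) > 0.5) == y).mean()
print(f"training accuracy on the toy task: {acc:.2f}")
```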


Going Beyond Human Brains: Deep Learning Takes On Synthetic Biology

#artificialintelligence

Work by Wyss Core Faculty member Peng Yin, in collaboration with Collins and others, has demonstrated that different toehold switches can be combined to compute the presence of multiple "triggers," similar to a computer's logic board. The article otherwise covers the same pair of Nature Communications studies summarized above, in which machine learning algorithms analyze reams of RNA-based "toehold" sequences and predict which will be most effective at sensing and responding to a desired target sequence.


AI tool could predict how drugs will react in the body - Futurity

#artificialintelligence

A new deep learning-based tool called Metabolic Translator may soon give researchers a better handle on how drugs in development will perform in the human body. When you take a medication, you want to know precisely what it does. Pharmaceutical companies go through extensive testing to make sure you do. Metabolic Translator, a computational tool that predicts metabolites, the products of interactions between small molecules like drugs and enzymes, could help improve the process. The new tool takes advantage of deep-learning methods and the availability of massive reaction datasets to give developers a broad picture of what a drug will do.
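
The tool's name suggests the framing: treat metabolite prediction as machine translation, with the parent molecule's SMILES string as the source sentence and the metabolite's SMILES as the target. The toy encoder-decoder below illustrates that framing on two real oxidation steps; the architecture, scale, and training data of the actual Metabolic Translator are not reproduced here, and this is assumed structure rather than the released code.

```python
# Toy character-level seq2seq over SMILES strings, sketching the
# "translation" framing of metabolite prediction (assumed, not the tool).
import torch
import torch.nn as nn

pairs = [("CCO", "CC=O"),       # ethanol -> acetaldehyde
         ("CC=O", "CC(=O)O")]   # acetaldehyde -> acetic acid

chars = sorted({c for s, t in pairs for c in s + t})
PAD, SOS, EOS = 0, 1, 2
stoi = {c: i + 3 for i, c in enumerate(chars)}
V = len(stoi) + 3

def encode(s):
    """Character ids for a SMILES string, terminated by EOS."""
    return torch.tensor([stoi[c] for c in s] + [EOS])

class Seq2Seq(nn.Module):
    def __init__(self, d=64):
        super().__init__()
        self.emb = nn.Embedding(V, d, padding_idx=PAD)
        self.enc = nn.GRU(d, d, batch_first=True)
        self.dec = nn.GRU(d, d, batch_first=True)
        self.out = nn.Linear(d, V)

    def forward(self, src, tgt_in):
        _, h = self.enc(self.emb(src))          # encode the substrate
        y, _ = self.dec(self.emb(tgt_in), h)    # teacher-forced decode
        return self.out(y)

model = Seq2Seq()
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.CrossEntropyLoss(ignore_index=PAD)

for _ in range(200):
    for s, t in pairs:
        src, tgt = encode(s).unsqueeze(0), encode(t)
        tgt_in = torch.cat([torch.tensor([SOS]), tgt[:-1]]).unsqueeze(0)
        loss = loss_fn(model(src, tgt_in).squeeze(0), tgt)
        opt.zero_grad()
        loss.backward()
        opt.step()

# Greedy decode: feed the substrate, generate the predicted metabolite.
itos = {i: c for c, i in stoi.items()}
src = encode("CCO").unsqueeze(0)
_, h = model.enc(model.emb(src))
tok, out = torch.tensor([[SOS]]), []
for _ in range(20):
    y, h = model.dec(model.emb(tok), h)
    tok = model.out(y).argmax(-1)
    if tok.item() == EOS:
        break
    out.append(itos.get(tok.item(), "?"))
print("predicted metabolite SMILES:", "".join(out))
```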


Live facial recognition is tracking kids suspected of being criminals

MIT Technology Review

A new investigation from Human Rights Watch has found that not only are children regularly added to CONARC, Argentina's national database of suspected criminals, but that the database also powers a live facial recognition system in Buenos Aires deployed by the city government. This makes the system likely the first known instance of its kind being used to hunt down kids suspected of criminal activity. "It's completely outrageous," says Hye Jung Han, a children's rights advocate at Human Rights Watch, who led the research. Buenos Aires first began trialing live facial recognition on April 24, 2019. Implemented without any public consultation, the system sparked immediate resistance.