Researchers have created a machine-learning system that efficiently predicts the future trajectories of multiple road users, such as drivers, cyclists, and pedestrians, which could enable an autonomous vehicle to navigate city streets more safely. Humans may be one of the biggest roadblocks to fully autonomous vehicles operating on city streets: if a robot is going to navigate a vehicle safely through downtown Boston, it must be able to predict what nearby drivers, cyclists, and pedestrians are going to do next. The new system may someday help driverless cars anticipate those moves in real time.
Instagram and Facebook users in Texas lost access to certain augmented reality filters Wednesday, following a lawsuit accusing parent company Meta of violating privacy laws. In February, Texas Attorney General Ken Paxton revealed he would sue Meta for using facial recognition in filters to collect data for commercial purposes without consent. Paxton claimed Meta was "storing millions of biometric identifiers" that included voiceprints, retina or iris scans, and hand and face geometry. Although Meta argued it does not use facial recognition technology, it has disabled its AR filters and avatars on Facebook and Instagram amid the litigation. The AR effects featured on Facebook, Messenger, Messenger Kids, and Portal will also be shut down for Texas users.
Lidar is an acronym for light detection and ranging. Lidar is like radar, except that it uses light instead of radio waves. The light source is a laser. A lidar sends out light pulses and measures the time it takes for a reflection bouncing off a remote object to return to the device. As the speed of light is a known constant, the distance to the object can be calculated from the travel time of the light pulse (Figure 1).
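The travel-time calculation described above can be sketched in a few lines of Python. This is an illustrative helper, not real lidar firmware; the function name and the one-microsecond example value are assumptions for demonstration. The key detail is that the measured time covers the round trip, so the one-way distance is half of speed-of-light times travel time.

```python
# Sketch: estimating distance from a lidar pulse's round-trip travel time.
# Illustrative only; the function name and example timing are hypothetical.

C = 299_792_458.0  # speed of light in a vacuum, in metres per second

def lidar_distance(round_trip_seconds: float) -> float:
    """Return the distance to the reflecting object, in metres.

    The pulse travels out to the object and back, so the one-way
    distance is half of (speed of light * measured travel time).
    """
    return C * round_trip_seconds / 2.0

# A reflection returning after 1 microsecond implies an object
# roughly 150 metres away.
print(round(lidar_distance(1e-6), 1))  # 149.9
```

This also shows why lidar timing electronics must be extremely precise: at the speed of light, a one-nanosecond timing error corresponds to about 15 cm of range error.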
A new artificial intelligence (AI) system could watch and listen to your videos and label things that are happening. MIT researchers have developed a technique that teaches AI to capture actions shared between video and audio. For example, their method can understand that the act of a baby crying in a video is related to the spoken word "crying" in a sound clip. It is part of an effort to teach AI to understand concepts that humans learn with no trouble but that computers find hard to grasp. "The prevalent learning paradigm, supervised learning, works well when you have datasets that are well described and complete," AI expert Phil Winder told Lifewire in an email interview.
The Seoul Metropolitan Government (SMG) has announced it is building a pilot driving zone for autonomous cars. Forming part of the cooperative intelligent transport system (C-ITS) construction project, the virtual reality autonomous driving simulator will reflect road, traffic, and weather conditions by using digital twin technologies. According to SMG, expanding the virtual territory to Gangnam and the city centre will enable Seoul to "leap forward" as a city of commercialised self-driving vehicles. The autonomous driving simulator will be open to the public, and anyone from companies to research institutes, start-ups, and universities can use it free of charge. SMG's rationale is that the more developers test the simulator, the more opportunity they have to improve their technologies, helping the industry to advance further.
A plastic-degrading enzyme enhanced by amino acid changes designed by a machine-learning algorithm can depolymerise polyethylene terephthalate (PET) at least twice as fast as, and at lower temperatures than, the next best engineered enzyme. Six years ago, scientists sifting through debris at a plastic bottle recycling plant discovered a bacterium that can degrade PET. The organism has two enzymes that hydrolyse the polymer first into mono-(2-hydroxyethyl) terephthalate and then into ethylene glycol and terephthalic acid, which it uses as an energy source. One enzyme in particular, PETase, has become the target of protein engineering efforts to make it stable at higher temperatures and boost its catalytic activity. A team led by Hal Alper at the University of Texas at Austin in the US has created a PETase that can degrade 51 different PET products, including whole plastic containers and bottles.
MIT scientists have developed a machine-learning model that proposes new molecules for the drug discovery process while ensuring the molecules it suggests can actually be synthesized in a laboratory. Pharmaceutical companies are using artificial intelligence to streamline the discovery of new medicines: machine-learning models can propose new molecules with specific properties that could fight certain diseases, accomplishing in minutes what might take humans months to achieve manually. But a major hurdle holds these systems back: the models frequently suggest molecular structures that are difficult or impossible to produce in a laboratory.
North American professional drone maker Draganfly has sent the first of nearly a dozen humanitarian drones to the non-profit organization Revived Soldiers Ukraine (RSU) in Europe, to be used to deliver insulin to hard-to-reach hospitals in the war-torn country. RSU has ordered 200 medical response drones from Draganfly, each costing $30,000 and equipped with temperature-managed payload boxes that can transport up to 35 pounds of blood, pharmaceuticals, insulin and other medicines, vaccines, and wound care kits, the drone maker said. Because insulin is a temperature-sensitive product, quick and safe transportation is a top priority. There are roughly 2.3 million people living with diabetes in Ukraine, according to the International Diabetes Federation, many of whom have Type 1 diabetes and require multiple daily injections of insulin to survive. For those living in high-conflict areas of the country, access to life-saving insulin is limited or non-existent.
Arun Subramaniyan has joined Intel to lead the Cloud & AI Strategy team. He joins from AWS, where he led the global solutions team for machine learning, quantum computing, high-performance computing (HPC), autonomous vehicles, and autonomous computing. His team was responsible for developing solutions across all areas of HPC, quantum computing, and large-scale machine-learning applications, spanning a $1.5B portfolio. At AWS, Arun founded and grew the global go-to-market and solutions teams for autonomous computing and quantum computing, growing those businesses 2-3x. His primary areas of research are Bayesian methods, global optimization, probabilistic deep learning for large-scale applications, and distributed computing.
Thousands of emperor penguins waddling around Antarctica have a stalker: a yellow rover tracking their every move. ECHO is a remote-controlled ground robot that silently spies on the emperor penguin colony in Atka Bay. The robot is monitored by the Single Penguin Observation and Tracking (SPOT) observatory. Both the SPOT observatory, which is also remote-operated through a satellite link, and the ECHO robot capture photographs and videos of animal populations in the Antarctic. The research is part of the Marine Animal Remote Sensing Lab (MARE), designed to measure the health of the Antarctic marine ecosystem.