Robots in the workplace can perform hazardous or even 'impossible' tasks, such as toxic-waste clean-up and desert or space exploration. AI researchers are also interested in the intelligent processing involved in moving about and manipulating objects in the real world.
The company Outrider, a pioneer in autonomous yard operations for logistics hubs, helps large enterprises improve safety and increase efficiency. The only company exclusively focused on automating all aspects of yard operations, Outrider eliminates manual tasks that are hazardous and repetitive. Outrider's mission is to drive the rapid adoption of sustainable freight transportation by deploying zero-emission systems. Outrider is a private company backed by NEA, 8VC, and other top-tier investors. For more information, visit www.outrider.ai
Long interested in the interactions between robots and humans, researchers in the Creative Machines Lab at Columbia Engineering have been working for five years to create EVA, a new autonomous robot with a soft and expressive face that responds to match the expressions of nearby humans. The research will be presented at the ICRA conference on May 30, 2021, and the robot blueprints are open-sourced on Hardware-X (April 2021). "The idea for EVA took shape a few years ago, when my students and I began to notice that the robots in our lab were staring back at us through plastic, googly eyes," said Hod Lipson, James and Sally Scapa Professor of Innovation (Mechanical Engineering) and director of the Creative Machines Lab. Lipson observed a similar trend in the grocery store, where he encountered restocking robots wearing name badges, and in one case, decked out in a cozy, hand-knit cap. "People seemed to be humanizing their robotic colleagues by giving them eyes, an identity, or a name," he said. "This made us wonder, if eyes and clothing work, why not make a robot that has a super-expressive and responsive human face?"
US safety regulators have opened 30 investigations into Tesla crashes involving 10 deaths since 2016 where an advanced driver assistance system was suspected to have been in use. The National Highway Traffic Safety Administration (NHTSA) released a list offering details about crashes under review by its special crash investigations programs. The agency, which has previously confirmed some specific Tesla crash investigations, had not previously released to Reuters a full accounting of all Tesla crashes investigated where Tesla's Autopilot system was suspected of being involved. Of the 30 Tesla crashes, NHTSA has ruled out Tesla's Autopilot in three and published reports on two of the crashes. Tesla did not immediately respond to a request for comment.
IMAGE: An artificial skin attached to a person's knee develops a purple "bruise" when hit forcefully against a metal cabinet. (Credit: Adapted from ACS Applied Materials & Interfaces 2021, DOI: 10.1021/acsami.1c04911)

When someone bumps their elbow against a wall, they not only feel pain but also might experience bruising. Robots and prosthetic limbs don't have these warning signs, which could lead to further injury. Now, researchers reporting in ACS Applied Materials & Interfaces have developed an artificial skin that senses force through ionic signals and also changes color from yellow to a bruise-like purple, providing a visual cue that damage has occurred.
A team of researchers from Germany's Fraunhofer FKIE institute has created a drone that can locate screaming humans. While it sounds like the stuff of dystopian fiction, it's actually something they set out to create to make it easier for first responders to find survivors following a natural disaster. "(Drones) can cover a larger area in a shorter period of time than rescuers or trained dogs on the ground," Macarena Varela, one of the lead engineers on the project, told The Washington Post. "If there's a collapsed building, it can alert and assist rescuers. It can go places they can't fly to or get to themselves."
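The article doesn't detail how the drone pinpoints a scream, but microphone-array systems like this typically rely on arrival-time differences between microphones. As a minimal, hypothetical sketch (not the Fraunhofer FKIE implementation), the bearing of a sound source can be estimated from the delay between two microphones a known distance apart:

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at ~20 °C

def bearing_from_delay(delay_s: float, mic_spacing_m: float) -> float:
    """Estimate a source's bearing (degrees from broadside) from the
    arrival-time difference between two microphones.

    Illustrative far-field model only: bearing = arcsin(c * Δt / d).
    """
    # Clamp to the physically valid range before taking arcsin,
    # since measurement noise can push the ratio slightly past ±1.
    ratio = max(-1.0, min(1.0, SPEED_OF_SOUND * delay_s / mic_spacing_m))
    return math.degrees(math.asin(ratio))

# A source directly in front of the array arrives at both mics
# simultaneously (zero delay), giving a bearing of 0 degrees.
print(bearing_from_delay(0.0, 0.5))
```

A real system would cross-correlate the two audio streams to measure the delay and fuse many such pairs across a larger array, but the geometry above is the core idea.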
A robotics company called Geek says it's developed a swarm of autonomous worker robots numbering in the thousands that learns and improves the more it works, growing smarter over time. Again: the more we work them, the smarter they get. This is probably the beginning of the end. For their part, Geek says they've deployed 15,000 of their autonomous robots in about 30 countries, but the real learning happens in an autonomous warehouse in Hong Kong, the company told CNN. There, the robots track their movements by scanning QR codes on the floor as they drive over them, allowing a team of engineers to continuously refine the swarm's algorithm.
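The floor-mounted QR codes effectively give each robot a periodic absolute position fix that corrects drift in its dead-reckoning estimate. A minimal sketch of that idea, with a made-up tag payload format (nothing here reflects Geek's actual scheme):

```python
# Hypothetical sketch: floor tags encode grid coordinates, e.g. "X12,Y34".
# Each scan replaces the robot's drift-prone odometry estimate with an
# absolute fix.

def parse_floor_tag(payload: str) -> tuple[int, int]:
    """Decode a tag payload like 'X12,Y34' into (x, y) grid coordinates."""
    x_part, y_part = payload.split(",")
    return int(x_part[1:]), int(y_part[1:])

class WarehouseRobot:
    def __init__(self) -> None:
        self.x, self.y = 0, 0  # current grid-cell estimate

    def on_tag_scanned(self, payload: str) -> None:
        # An absolute fix overrides any accumulated odometry error.
        self.x, self.y = parse_floor_tag(payload)

robot = WarehouseRobot()
robot.on_tag_scanned("X12,Y34")
print(robot.x, robot.y)
```

Logging these fixes fleet-wide is also what would let engineers replay trajectories and refine the swarm's routing algorithm, as the article describes.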
Waymo announced that it raised $2.5 billion from its parent company, Alphabet, and a dozen outside investors. It disclosed that the money is going to be used in its efforts to deploy autonomous vehicles for use on public roads. The funding round brings the total amount of capital invested into Waymo to more than $5 billion. That number does not include what has been spent by Alphabet to finance operations within the unit, which started as an internal Google project more than ten years ago. The autonomous vehicle development initiative was started by Google in 2009, with a small team of engineers sourced from its X research lab. The initiative initially operated under the codename Project Chauffeur and was eventually spun off after seven years into a standalone unit that would be called Waymo.
Ever since Riot Games offered a small tease during the Summer Games Fest, Valorant players have had a genuine reason to believe that the new agent coming to the popular free-to-play shooter will be a robot. Not one to keep fans guessing, the company today unveiled KAY/O, a "machine of war" whose mechanics are all named for code terminology and borrow a little from other popular FPSes. KAY/O is an initiator with three throwable abilities. The first is ZERO/point, a knife that when cast lodges into the first surface it hits and suppresses anyone within its blast radius. Think Revenant's Silence from Apex Legends but with a blade instead of a device.
Mazda Motor Corp. said Thursday that it plans to launch 13 electrified vehicle models globally by 2025, starting with new models next year. The new models include three electric vehicles, five gas-electric hybrids and five plug-in hybrids, the automaker said. The plan is part of Mazda's efforts to achieve its goal of electrifying all of its cars and making a quarter of them electric vehicles by 2030. Starting in 2025, Mazda will shift its focus to developing electric vehicles using a dedicated platform now under development. The company also said it will release its first autonomous car next year.
Efficiency and cost-effectiveness are the biggest challenges facing supply chain management today. Businesses are constantly striving to reduce costs, enhance profit margins, and provide exceptional customer service. In such a competitive market, disruptive technologies like Machine Learning (ML) and Artificial Intelligence (AI) have opened up exciting opportunities for companies. Are you seizing these opportunities? Artificial Intelligence and Machine Learning have recently become buzzwords across different verticals, but what do they actually mean for modern supply chain management?