Robots in the workplace can perform hazardous or even 'impossible' tasks, e.g., toxic waste cleanup, desert and space exploration, and more. AI researchers are also interested in the intelligent processing involved in moving about and manipulating objects in the real world.
A small fleet of Cruise robotaxis in San Francisco suddenly stopped operating on Tuesday night, blocking traffic on a street in the city's Fillmore district for a couple of hours until employees were able to arrive. TechCrunch first noticed a Reddit post that featured a photo of the stalled driverless cabs at the corner of Gough and Fulton streets. Cruise -- which is General Motors' AV subsidiary -- only launched its commercial robotaxi service in the city last week. The rides feature no human safety driver, are geo-restricted to certain streets and can only operate in the late evening hours. Cruise apologized for the incident in a statement, but gave little explanation for what caused the mishap.
A crewless robotic boat retracing the 1620 sea voyage of the Mayflower has landed near Plymouth Rock. The sleek Mayflower Autonomous Ship met with an escort boat as it approached the Massachusetts shoreline Thursday, more than 400 years after its namesake's historic journey from England. It was towed into Plymouth Harbor -- per U.S. Coast Guard rules for crewless vessels -- and docked near a replica of the original Mayflower that brought the Pilgrims to America. Piloted by artificial intelligence technology, the 50-foot (15-meter) trimaran didn't have a captain, navigator or any humans on board. The solar-powered ship's first attempt to cross the Atlantic in 2021 was beset with technical problems, forcing it back to its home port of Plymouth, England -- the same place the Pilgrim settlers sailed from in 1620.
NASA's Perseverance rover has been aptly named because, nearly two months after beginning its search for signs of past life on Mars, it has yet to find any viable samples. The car-sized robot began its mission on April 22 to find ancient biomarkers in the Martian clay, which could indicate whether alien life ever existed on the Red Planet. It has been roaming around an ancient delta to look for sampling sites that might contain ancient microbes and organics. The rover then drills down to extract a specimen that it plans to leave at the base of the delta to be retrieved in future missions. However, NASA has since revealed that, so far, no samples have been successfully collected. The fragile clay materials the rover targets have been known to fracture, crack and crumble during the abrasion and coring process.
Washington, DC (CNN) Tesla (TSLA) is increasingly turning to machines rather than humans as it attempts to develop autonomous vehicles. As part of Tesla's plans to cut 10% of salaried staff, the company has laid off a significant number of its data annotation specialists. These specialists do grunt work that is critical to empowering artificial intelligence systems to handle complex tasks like driving safely down a city street. The layoffs were first reported by Bloomberg Tuesday and confirmed by CNN Business. Data annotation specialists use software tools to manually label objects in video clips collected from Tesla vehicles.
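Labels of this kind are the raw material for supervised training: each hand-drawn box tells the model what a pedestrian, cyclist or traffic light looks like in a real frame. As an illustration only (the class and field names below are hypothetical, not Tesla's internal format), a single annotation might be represented like this:

```python
from dataclasses import dataclass

@dataclass
class BoxLabel:
    """One manually drawn bounding box in a single video frame."""
    frame_index: int
    category: str   # e.g., "pedestrian", "cyclist", "traffic_light"
    x: int          # top-left corner of the box, in pixels
    y: int
    width: int
    height: int

# An annotator tags a cyclist spotted in frame 120 of a clip.
label = BoxLabel(frame_index=120, category="cyclist",
                 x=412, y=230, width=64, height=128)
print(label.category, label.width * label.height)  # cyclist 8192
```

Thousands of such records per clip, aggregated across a vehicle fleet, are what the annotation teams produce for the training pipeline.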
Robotic process automation frees employees from performing mundane tasks and enables them to focus on more productive business activities. In this article, we discuss the top 5 ways to keep RPA bot collaboration on the right track. RPA bots are software robots that perform tasks in a digital environment. Because they automate repetitive tasks, they are also known as the "digital workforce". With traditional automation tools, software developers have to create a list of actions to automate a task and interface with back-end systems using application programming interfaces (APIs).
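The "list of actions" idea can be made concrete with a tiny sketch. The task, record fields and function name below are hypothetical examples for illustration, not any real RPA product's API:

```python
def copy_invoice_total(invoice: dict, ledger: list) -> list:
    """Automate one repetitive clerical task: read a value from a
    source record and post it to a ledger, exactly the scripted
    sequence of steps a human clerk would perform by hand."""
    total = invoice["total"]  # step 1: read from the source system
    ledger.append(total)      # step 2: write to the target system
    return ledger

ledger = copy_invoice_total({"id": "INV-001", "total": 250.0}, [])
print(ledger)  # [250.0]
```

An RPA bot generalizes this pattern: instead of calling back-end APIs directly, it drives the same user interfaces a person would, which is what lets it automate systems that expose no API at all.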
Developed by robotics researchers at the University of Michigan, the new approach could cut learning time for new materials and environments down to a few hours rather than a week or two. In simulations, the expanded training data set improved the success rate of a robot looping a rope around an engine block by more than 40% and nearly doubled the successes of a physical robot on a similar task. That task is among those a robot mechanic would need to be able to do with ease. But using today's methods, learning how to manipulate each unfamiliar hose or belt would require huge amounts of data, likely gathered over days or weeks, says Dmitry Berenson, U-M associate professor of robotics and senior author of a paper presented today at Robotics: Science and Systems in New York City. In that time, the robot would play around with the hose -- stretching it, bringing the ends together, looping it around obstacles and so on -- until it understood all the ways the hose could move.
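The general idea of expanding a training set, generating plausible variants of recorded motions instead of collecting more real-world trials, can be sketched as follows. This is a generic data-augmentation sketch under that assumption, not the U-M authors' actual algorithm:

```python
import random

def expand_dataset(trajectories, copies=3, noise=0.01, seed=0):
    """Create perturbed variants of each recorded trajectory (a list of
    2D points) so the learner sees many plausible motions without any
    additional real-world data collection."""
    rng = random.Random(seed)
    expanded = list(trajectories)  # keep the originals
    for traj in trajectories:
        for _ in range(copies):
            expanded.append([(x + rng.uniform(-noise, noise),
                              y + rng.uniform(-noise, noise))
                             for x, y in traj])
    return expanded

data = expand_dataset([[(0.0, 0.0), (0.1, 0.2)]])
print(len(data))  # 4: the original plus three perturbed copies
```

The payoff reported above is exactly this trade: hours of synthetic expansion standing in for days or weeks of physical trial and error.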
On June 29th, 2022, Siemens, a venerable digital industrial company, and NVIDIA, one of the leaders in the metaverse, announced a collaboration to create an industrial metaverse. The goal: accelerate the adoption and mainstream use of industrial automation. At the core of this relationship are Siemens' digital transformation platform, Siemens Xcelerator, and NVIDIA's Omniverse, a 3D-design and collaboration platform. Siemens brings physics-based digital models to the partnership while NVIDIA brings its real-time AI to increase decision velocity. As NVIDIA states, "Omniverse is a multi-GPU scalable virtual world engine that enables teams to connect 3D design and CAD applications for collaborative design workflows and allows users to build physically accurate virtual worlds for training, testing and operating AI agents such as robots and autonomous machines."
'Everything a creator builds is in their own image' - a sentiment we've been fed since forever might actually be true. A robot recently shocked scientists after it became racist and sexist. While such deplorable behaviour is commonly observed among humans, we had higher hopes for artificial intelligence. If you expected AI to be impartial and intellectually superior, that's clearly not the case. A recent experiment by researchers from Johns Hopkins University, the Georgia Institute of Technology, and the University of Washington showed how a robot controlled by a machine learning tool began to categorise people based on dangerous stereotypes about race and gender.
When it launched last year, the DJI Mavic 3 grabbed a lot of headlines with features like a big Four Thirds sensor and a second 7X telephoto camera. But it also drew some criticism for going on sale with key features like ActiveTrack and QuickShots still not available. That meant that I and others couldn't assess those features in our early Mavic 3 reviews. And because of that, potential buyers couldn't get a full picture of the drone before paying up to $5,000 for one. Following three major firmware updates in December, January and May, all the promised functions and more are finally here. Now, I'm going to test them out using the same exact drone to see how well they work.
Two fields that are rapidly evolving, advancing and holding infinite promise for the future of humanity are robotics and artificial intelligence (AI). From the simplest of tasks to the most complex and demanding in our everyday lives, advances in robotics and AI have made it possible to create machines that can perform tasks with incredible speed and accuracy. With these two technologies beginning to converge, are we entering a new age in which robots more intelligent and capable than ever before shape the future of humanity? Although robotics and AI have long been part of the conversation, the last two decades have seen a rapid rise in their development and application. Fears that these machines will take over our jobs, a common misconception, have largely been allayed.