It will soon become easy for self-driving cars to hide in plain sight. The rooftop lidar sensors that currently mark many of them out are likely to become smaller. Mercedes vehicles with the new, partially automated Drive Pilot system, which carries its lidar sensors behind the car's front grille, are already indistinguishable to the naked eye from ordinary human-operated vehicles. Is this a good thing? As part of our Driverless Futures project at University College London, my colleagues and I recently concluded the largest and most comprehensive survey of citizens' attitudes to self-driving vehicles and the rules of the road.
We love it when people exceed expectations. Whether it's an athlete who steps up to replace an injured starter or a team that pulls together to deliver exceptional results, it is inspiring to see long-held assumptions about potential turned upside down. Now, service organizations have an opportunity to exceed traditional expectations in the same way. Instead of being considered simply a means of connection and cost containment after a customer's purchase, intelligent service teams can become a strategic driver that directs value back to the business. Focusing on speed, insights, and accuracy, SAP Service Cloud resolves customer issues at unmatched speed -- protecting the brand's promise and securing future growth.
Just a few years ago, artificial intelligence stirred our imagination via the voice of Arnold Schwarzenegger in "Terminator" or Agent Smith in "The Matrix". It wasn't long before the rebellious robots' film dialogue was replaced by the actual chats we have with Siri or Alexa over our morning cup of coffee. Nowadays, artificial intelligence is entering new areas of our lives ever more boldly. The automotive industry is one of the sectors predicted to accelerate its adoption in the coming years: by 2030, 95-98% of new vehicles are likely to use this technology.
The Seoul Metropolitan Government (SMG) has announced it is building a pilot driving zone for autonomous cars. Forming part of the cooperative intelligent transport system (C-ITS) construction project, the virtual reality autonomous driving simulator will reflect road, traffic, and weather conditions by using digital twin technologies. According to SMG, expanding the virtual territory to Gangnam and the city centre will enable Seoul to "leap forward" as a city of commercialised self-driving vehicles. The autonomous driving simulator will be open to the public, and anyone from companies to research institutes, start-ups, and universities can use it free of charge. SMG's rationale is that the more developers test their technologies on the simulator, the more opportunity there is to improve them and help the industry advance.
Steven J. Vaughan-Nichols, aka sjvn, has been writing about technology and the business of technology since CP/M-80 was the cutting-edge PC operating system, 300 bps was a fast Internet connection, WordStar was the state-of-the-art word processor -- and we liked it. Linux has long played a role in cars. Some companies, such as Tesla, run their own homebrew Linux distros. Audi, Mercedes-Benz, Hyundai, and Toyota all rely on Automotive Grade Linux (AGL). AGL is a collaborative cross-industry effort with more than 140 members developing an open platform for connected cars.
What do you think of the update to the SAE's levels of autonomous driving? Do you find these levels helpful when it comes to knowing what an AV can do? What's the difference between driver support features and automated driving? The Society of Automotive Engineers (SAE) now describes levels 0-2 as 'driver support features', while level 3 and above encompass what it refers to as 'automated driving features' -- six degrees of driving automation in all, from zero automation to full automation.
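The six levels and the support/automation split can be sketched as a small enum. The level names below paraphrase the SAE J3016 taxonomy, and the helper function is purely illustrative, not part of any SAE tooling:

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """Six degrees of driving automation, paraphrasing SAE J3016."""
    NO_AUTOMATION = 0           # human does everything
    DRIVER_ASSISTANCE = 1       # steering OR speed support (e.g. lane centering)
    PARTIAL_AUTOMATION = 2      # steering AND speed support; driver must supervise
    CONDITIONAL_AUTOMATION = 3  # system drives; human must take over on request
    HIGH_AUTOMATION = 4         # no takeover needed within a defined domain
    FULL_AUTOMATION = 5         # drives everywhere a human could

def is_automated_driving(level: SAELevel) -> bool:
    """Levels 0-2 are 'driver support features'; 3+ are 'automated driving features'."""
    return level >= SAELevel.CONDITIONAL_AUTOMATION

print(is_automated_driving(SAELevel.PARTIAL_AUTOMATION))  # False
print(is_automated_driving(SAELevel.HIGH_AUTOMATION))     # True
```

The key boundary the SAE update draws is between level 2 and level 3: below it the human is always supervising, above it the system itself is driving.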
Humanity has been waiting for self-driving cars for several decades. Thanks to the extremely fast evolution of technology, the idea recently went from "possible" to "commercially available in a Tesla". Deep learning is one of the main technologies that enabled self-driving. It's a versatile tool that can tackle almost any science or engineering problem -- it is used in physics, for example to analyse proton-proton collisions at the Large Hadron Collider, just as well as in Google Lens to classify pictures. Convolutional neural networks (CNNs) are the primary algorithm these systems use to recognize and classify different parts of the road, and to make appropriate decisions. Along the way, we'll see how Tesla, Waymo, and Nvidia use CNN algorithms to make their cars driverless, or autonomous. One of the first self-driving cars, ALVINN (the Autonomous Land Vehicle In a Neural Network), was built in 1989. It used neural networks to detect lines, segment the environment, navigate, and drive. It worked well, but it was limited by slow processing power and insufficient data.
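To make the CNN idea concrete, here is a toy NumPy implementation of the convolution operation at the heart of such networks. This is a hand-rolled sketch for illustration only -- not how Tesla, Waymo, or Nvidia implement their perception stacks -- showing how a small kernel slid across an image responds strongly to a feature such as an edge:

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2D cross-correlation: slide the kernel over the image and sum
    elementwise products -- the core operation of a CNN layer."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A tiny "image" whose right half is bright, and a vertical-edge kernel:
image = np.array([
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
    [0, 0, 1, 1],
], dtype=float)
edge_kernel = np.array([
    [-1, 1],
    [-1, 1],
], dtype=float)

response = conv2d(image, edge_kernel)
print(response)  # strongest response along the boundary between the halves
```

A real CNN stacks many such learned kernels with nonlinearities and pooling, so that early layers detect edges like this one and deeper layers detect lane markings, vehicles, and pedestrians.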
At Woven Planet Level 5, we're using machine learning (ML) to build an autonomous driving system that improves as it observes more human driving. This is based on our Autonomy 2.0 approach, which leverages machine learning and data to solve the complex task of driving safely. This is unlike traditional systems, where engineers hand-design rules for every possible driving event. Last year, we took a critical step in delivering on Autonomy 2.0 by using an ML model to power our motion planner, the core decision-making module of our self-driving system. We saw the ML Planner's performance improve as we trained it on more human driving data.
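Learning to drive from human demonstrations can be reduced to a tiny illustrative sketch. The synthetic "human driving" log and least-squares fit below stand in for real fleet data and a neural-network planner; all names, features, and numbers are invented for the example and are not Woven Planet's actual system:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical logged human driving data: state features (lane offset,
# heading error) and the steering command the human driver applied.
states = rng.normal(size=(1000, 2))
true_gains = np.array([-0.8, -1.5])  # the unknown "human policy"
steering = states @ true_gains + rng.normal(scale=0.01, size=1000)

# Imitation learning in its simplest form: fit a model that reproduces
# the human commands (here, ordinary least squares instead of a deep net).
learned_gains, *_ = np.linalg.lstsq(states, steering, rcond=None)

def ml_planner(state):
    """Predict a steering command for a new state using the learned policy."""
    return state @ learned_gains

print(learned_gains)  # close to [-0.8, -1.5]
```

The point of the sketch is the data dependence: feeding the fit more (and more varied) human driving data tightens the learned policy, which mirrors the claim that the ML Planner improves as it is trained on more human driving.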
It may sound like something straight out of Star Wars, but Hyundai's planned 'walking car' is a step closer to reality after the vehicle manufacturer unveiled a new $20 million (£16 million) development centre to expedite its arrival. The aim of the New Horizons Studio, which has opened in Montana in the US, is to build vehicles for future customers who want or need to travel over terrain that is challenging for conventional ground vehicles. It will focus on the development of Ultimate Mobility Vehicles (UMVs), including a car with legs that can simply walk over anything it struggles to drive over. The Elevate concept, which resembles the All Terrain Armoured Transport (AT-AT) walkers found in the Star Wars universe, combines a traditional wheel with a leg that unfolds for dangerous terrain. Its aim is to address challenging driving situations and potentially save lives as a first responder in natural disasters.
Toyota Motor North America (TMNA) is partnering with Invisible AI to deploy artificial intelligence (AI) in its factories to enhance efficiency and safety. The Texas-based company's computer vision platform will be installed in 14 Toyota factories in North America. The AI will analyze manufacturing operations to detect technical issues, revealing problems invisible to the human eye and to ordinary cameras, and flagging them so that process quality and safety can be improved. According to Forbes, Toyota aims to apply computer vision technology to accurately review the assembly process and reduce the time it takes to find inefficiencies. Under the two-year agreement, Toyota factories will be equipped with a system of 500 AI devices, each using NVIDIA processors and a high-resolution 3D camera to observe operations.