If you are looking for an answer to the question "What is Artificial Intelligence?" and you only have a minute, then here's the definition the Association for the Advancement of Artificial Intelligence offers on its home page: "the scientific understanding of the mechanisms underlying thought and intelligent behavior and their embodiment in machines."
However, if you are fortunate enough to have more than a minute, then please get ready to embark upon an exciting journey exploring AI (but beware, it could last a lifetime) …
Earlier this year, we open-sourced a research project called AirSim, a high-fidelity system for testing the safety of artificial intelligence systems. AirSim provides realistic environments, vehicle dynamics, and sensing for research into how autonomous vehicles that use AI can operate safely in the open world. Today, we are sharing an update to AirSim: We have extended the system to include car simulation, which will help advance the research and development of self-driving vehicles. The latest version is available now on GitHub as an open-source, cross-platform offering. The updated version of AirSim also includes many other features and enhancements, including additional tools for testing airborne vehicles.
In the meantime, if one of them goes berserk, here's a useful tactic: Shut the door behind you. One after another, robots in a government-sponsored contest were stumped by an unlocked door that blocked their path at an outdoor obstacle course. One bipedal machine managed to wrap a claw around the door handle and open it but was flummoxed by a breeze that kept blowing the door shut before it could pass through. Robots excel at many tasks, as long as they don't involve too much hand-eye coordination or common sense. Like some gifted children, they can perform impressive feats of mental arithmetic but are profoundly klutzy on the playground.
Continued from: "Advanced image sensors take automotive vision beyond 20/20." And there are many others now in the race to process all of that vehicle sensor data. Among them, Toshiba has been evolving its Visconti line of image recognition processors in parallel with increasingly demanding European New Car Assessment Programme (Euro NCAP) requirements. Starting in 2014, the Euro NCAP began rating vehicles based on active safety technologies such as lane departure warning (LDW), lane keep assist (LKA), and autonomous emergency braking (AEB). These requirements extended to daytime pedestrian AEB and speed assist systems (SAS) in 2016.
Next month in San Francisco, Uber will stand trial in federal court for allegedly cheating in the race to commercialize self-driving cars. Google parent Alphabet accuses Uber of stealing designs for sensors called lidars that give a vehicle a 3-D view of its surroundings, an "unjust enrichment" it says will take $1.8 billion to remedy. Meanwhile in Toronto, Uber has a growing artificial-intelligence lab led by a woman who's spent years trying to make lidar technology less important. Raquel Urtasun joined Uber to set up a new autonomous-vehicle research lab in May--almost three months after Alphabet filed suit. She still works one day a week in her old job as an associate professor at the University of Toronto.
You're lying on your stomach, with your arms draped forwards, almost like you're going to get a shoulder massage. Except this is not a moment for relaxation. Through a VR headset, you see flashes of color, an unfamiliar view of the world, a group of red lines that looks something like a person. And now you have to make a decision, because you're rolling forward, head first, and your right hand is wrapped around the joystick that determines which way you're going. Do you continue forward, and risk hitting that blob that might be a human being?
Autonomous cars often proudly claim to be fitted with a long list of sensors--cameras, ultrasound, radar, lidar, you name it. But if you've ever wondered why so many sensors are required, look no further than this picture. You're looking at what's known in the autonomous-car industry as an "edge case"--a situation where a vehicle might have behaved unpredictably because its software processed an unusual scenario differently from the way a human would. In this example, image-recognition software applied to data from a regular camera has been fooled into thinking that images of cyclists on the back of a van are genuine human cyclists. This particular blind spot was identified by researchers at Cognata, a firm that builds software simulators--essentially, highly detailed and programmable computer games--in which automakers can test autonomous-driving algorithms.
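One reason a long sensor list helps with edge cases like this: a picture of a cyclist is flat, while a real cyclist has depth structure a lidar can measure directly. The sketch below illustrates that idea only; the function name, data layout, and threshold are hypothetical, not drawn from any production autonomous-driving stack.

```python
def looks_flat(depths_m, max_spread_m=0.15):
    """Given the forward distances (in metres) of lidar returns that fall
    inside a camera bounding box, report whether the surface is flat.

    A real cyclist produces depth variation (wheels, torso, handlebars);
    an image of a cyclist on the back of a van returns a nearly uniform
    depth, so a camera-only detection there can be treated as suspect.
    The 0.15 m spread threshold is an arbitrary illustrative choice.
    """
    return max(depths_m) - min(depths_m) < max_spread_m

# Nearly uniform depths: consistent with a flat surface such as a van door.
print(looks_flat([12.00, 12.03, 11.98, 12.01]))  # True

# Half a metre of depth structure: consistent with a real, 3-D object.
print(looks_flat([12.0, 11.7, 12.2, 11.8]))  # False
```

This is, of course, only one cross-check; real systems fuse many such cues, which is exactly why the sensor list on an autonomous car runs so long.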
AT THE Consumer Electronics Show in Las Vegas two years ago a leading car maker unveiled a machine that it said was a vision of the future. It certainly looked the part, with a sleek silver body shell, a steering wheel that retracted into the dashboard and four lounge-style chairs that could rotate to face one another. The most startling feature, though, was its self-driving ability. It was filmed navigating through San Francisco shortly before its futuristic doors swung open to journalists. We stepped onto the car's wooden floor and looked at a calming forest projected onto the windows as the car drove itself along the runway of a nearby airbase.
Video Friday is your weekly selection of awesome robotics videos, collected by your Automaton bloggers. We'll also be posting a weekly calendar of upcoming robotics events for the next two months; here's what we have so far (send us your events!): Let us know if you have suggestions for next week, and enjoy today's videos. A new RoboBee from Harvard can swim underwater, and then launch itself into the air with a microrocket and fly away. At the millimeter scale, the water's surface might as well be a brick wall.
For a moment there, Arizona was the place for autonomous vehicles learning to drive. It's a logical starting point for experimental tech--still in its wobbly, Bambi legs stage--that likes warm weather, little rain, and wide open roads. It's easier for their complicated sensors to "see" there, you see. Arizona is, in other words, a lot like California, without the aggressive Department of Motor Vehicles and its pesky regulations. Governor Doug Ducey has directed all state agencies to make it as easy as possible for fully self-driving cars to test in Arizona, no permitting or reporting required.
The applications of the Internet of Things (IoT) have been growing dramatically in recent years. According to IDC, the transportation sector will be among the first to see significant growth from the IoT, and the global IoT market in the transportation sector is expected to reach $195 billion by 2020. The smart IoT is dramatically accelerating the pace of innovation and transforming how transportation and infrastructure operate. The ubiquitous deployment of smart, connected sensors and things, combined with artificial intelligence (AI) and big data analytics, can enable us to gather insightful knowledge and perform real-time and even predictive computation, helping us reach better decisions and develop better plans to improve the safety, efficiency, and reliability of smart transportation. Here we take a look at some important applications of the IoT in intelligent transportation systems and smart cities.