If you are looking for an answer to the question "What is Artificial Intelligence?" and you only have a minute, then here's the definition the Association for the Advancement of Artificial Intelligence offers on its home page: "the scientific understanding of the mechanisms underlying thought and intelligent behavior and their embodiment in machines."
However, if you are fortunate enough to have more than a minute, then please get ready to embark upon an exciting journey exploring AI (but beware, it could last a lifetime) …
In its race to embrace driverless vehicles, Washington has cleared away regulatory hurdles for auto companies and brushed aside consumer warnings about the risk of crashes and hacking. But at a recent hearing, lawmakers absorbed an economic argument that illustrated how the driverless revolution they are encouraging could backfire politically, particularly in Trump country. It was the tale of a successful, long-distance beer run. A robotic truck coasted driverless 120 miles down Interstate 25 in Colorado on its way to deliver 51,744 cans of Budweiser. Not everyone at the hearing was impressed by the milestone, particularly the secretary-treasurer of the Teamsters, whose nearly 600,000 unionized drivers played no small role in President Trump's victory last year.
The most dangerous part of any car, say the experts, 'is the nut behind the steering wheel'. Human error is to blame for most accidents, so remove that 'nut' and let the car drive itself and many lives will be saved, runs the argument now pushed by ministers, manufacturers and supporters of what is known as 'autonomous driving'. And it certainly seems as if it's full speed ahead for the driverless car. The Prime Minister Theresa May and Chancellor Philip Hammond yesterday confirmed plans -- widely trailed ahead of the Budget tomorrow -- to invest £900 million to deliver 'fully driverless cars' by 2021. But is the Government right to be putting its foot on the accelerator?
The company says they're deploying cars without backup drivers. A new study is bolstering the case for putting more autonomous vehicles on the road sooner rather than later -- at the same time that self-driving cars are hitting a milestone in parts of the Phoenix metropolitan area. A research report released this week argues that deploying driverless cars commercially as soon as they become at least a little safer than human drivers could end up saving hundreds of thousands of lives -- as compared to waiting for the technology to be close to perfect. Meanwhile, on the roads in Arizona, the first public tests of self-driving cars without backup drivers have begun.
A driverless shuttle bus crashed less than two hours after it was launched in Las Vegas on Wednesday. The city's officials had been hosting an unveiling ceremony for the bus, described as the US' first self-driving shuttle pilot project geared towards the public, before it collided with a semi-truck. According to the Las Vegas Review-Journal, the human driver of the other vehicle was at fault, there were no injuries, and the incident caused minor damage. The oval-shaped shuttle -- sponsored by AAA, the Review-Journal added -- can transport up to 12 passengers at a time. It has an attendant and a computer monitor, and uses GPS and electric curb sensors instead of brakes or a steering wheel.
Most of us will know the age-old saying that we want to be "safe and secure" – for ourselves, our families, and our work colleagues in all aspects of life. However, our understanding of what it means to be safe and secure, especially in today's digital age and in particular the growing era of the Internet of Things (IoT), isn't the same as it once was. To be sure, the natural evolution of innovation, technological or otherwise, continues irrespective of the accelerating awareness and adoption of the interconnected consumer and industrial devices that make up the IoT. The world of the Industrial Internet of Things (IIoT) is evolving at a similar pace and, now more than ever, is bridging into consumers' lives on an individual level -- so much so that it is becoming difficult to differentiate the IoT from the IIoT, outside of those in the industry of course.
Self-driving cars are being tested all over the United States. New York City, Sacramento, and San Francisco are just some of the places you can see autonomous vehicles on the road. Waymo, Google's self-driving car division, has been a leader in the technology. They recently partnered with Intel to further hone what their vehicles can do. The CEO of Waymo, John Krafcik, recently wrote a Medium post detailing where the company will be testing their cars next: Michigan.
The National Highway Traffic Safety Administration has suggested a set of 28 "behavioral competencies," or basic things an autonomous vehicle should be able to do. Some are exceedingly basic ("detect and respond to stopped vehicles," "navigate intersections and perform turns"); others, more intricate ("respond to citizens directing traffic after a crash"). "This overview of our safety program reflects the important lessons learned through the 3.5 million miles Waymo's vehicles have self-driven on public roads, and billions of miles of simulated driving, over the last eight years," Waymo Chief Executive John Krafcik said in a letter Thursday to U.S. Transportation Secretary Elaine Chao. "You can't expect to program the car for everything you're possibly going to see," said Ron Medford, Waymo's safety director and a former senior National Highway Traffic Safety Administration official.
OK, sure, there are self-driving cars on California roads today. General Motors' Cruise has Chevrolet Bolts zipping around San Francisco; Google self-driving spinoff Waymo has Chrysler Pacificas motoring about Mountain View; secretive startup Zoox has black Toyota Highlanders mixing it up along San Francisco's Embarcadero. But all these vehicles, however capable, have a decidedly un-futuristic feature: there's a human in the driver's seat, ready to grab control in case the robot goes rogue. It's not just common sense, it's the law. California's Department of Motor Vehicles requires that safety driver to be there.
California officials Wednesday unveiled new regulations that would allow autonomous vehicles to operate on state roads in test projects without a human operator. The new rules come as a growing number of tech firms and automakers test self-driving vehicles, and follow new guidelines from the federal government aimed at spurring technology widely believed to improve road safety and reduce accidents. DMV director Jean Shiomoto said the agency hopes to finalize its regulations by the end of the year and noted that 42 companies have permits to test autonomous vehicles in the state. The state agency said any autonomous vehicles would need to meet federal safety standards.
California's existing regulations, which require a human driver behind the wheel even when completely driverless cars are being tested, have been criticized by industry leaders and some politicians as too strict. DMV officials said Wednesday that the federal government will continue to set safety standards for automobiles, while the state's role is to make sure vehicles traveling on state highways conform to federal standards. The new regulations would require that manufacturers testing driverless cars on California roads certify that they're meeting federal standards and that any public paperwork shared with federal regulators on driverless testing is also passed to the DMV. The new regulations would trim back existing rules that require municipalities to approve vehicle testing.