If you are looking for an answer to the question "What is Artificial Intelligence?" and you only have a minute, then here's the definition the Association for the Advancement of Artificial Intelligence offers on its home page: "the scientific understanding of the mechanisms underlying thought and intelligent behavior and their embodiment in machines."
However, if you are fortunate enough to have more than a minute, then please get ready to embark upon an exciting journey exploring AI (but beware, it could last a lifetime) …
This photo provided by Tesla shows a 2017 Tesla Model 3, a vehicle that has a semiautonomous driving system called Autopilot. Tesla can update the Autopilot software over the air, without requiring a trip to a service center. Tesla offers Autopilot on its Model S, Model X and Model 3 vehicles. Tesla's Autopilot and similar smart tech in new cars are meant to assist you. They do not, we repeat, do not turn your vehicle into a self-driving car.
One day last summer, Microsoft's director of artificial intelligence research, Eric Horvitz, activated the Autopilot function of his Tesla sedan. The car steered itself down a curving road near Microsoft's campus in Redmond, Washington, freeing his mind to better focus on a call with a nonprofit he had cofounded around the ethics and governance of AI. Then, he says, Tesla's algorithms let him down. "The car didn't center itself exactly right," Horvitz recalls. Both tires on the driver's side of the vehicle nicked a raised yellow curb marking the center line, and shredded.
Rules were made to be broken. Unless you're up against, say, special prosecutor Robert Mueller or the frightful might of the Federal Aviation Administration. Turns out that in transportation, like so many other things, regulators can serve as a thick, strong wall, crushing even the most delightful (or horrifying) of innovations with the weight of concrete. Observe Uber's flying-car program, which, as senior writer Jack Stewart explains, will definitely have to contend with the FAA before it gets off the ground. Or the shaky future of new and potentially life-saving semiautonomous vehicle tech in the US, which carmakers are loath to unleash in a country without firm rules about licensing and liability.
The driver of a Tesla Model S crashed into a fire truck while driving down a California highway. SAN FRANCISCO -- If you want proof that people will push the limits of a technology, even at risk to their lives, look no farther than last week's crash of a Tesla Model S in Utah. According to a report issued Wednesday by police in South Jordan, a suburb of Salt Lake City, the 28-year-old woman at the wheel of the $100,000 electric sedan engaged Autopilot -- Tesla's driver-assist software that requires driver oversight -- and then didn't touch the steering wheel for 80 seconds. Until she hit a stopped fire truck at 60 mph. That she walked away with only a broken foot likely warrants a separate story on how the Model S can handle a crash.
Police said the driver who crashed her Tesla into the back of a stopped fire truck in Utah last week had her hands off the steering wheel at the time, confirming the woman's claim that the vehicle's Autopilot feature was engaged. The 28-year-old had her hands off the wheel for 80 seconds up until the May 11 crash in South Jordan, Utah, police said Wednesday, citing Tesla's official crash report. Data recovered from the woman's Tesla Model S showed more than a dozen instances where she had taken her hands off of the steering wheel during the drive cycle before the crash, according to the South Jordan Police Department. "On two such occasions, she had her hands off the wheel for more than one minute each time and her hands came back on only after a visual alert was provided," the report said. "Each time she put her hands back on the wheel, she took them back off the wheel after a few seconds."
A Utah driver turned on the semi-autonomous functions of her Tesla vehicle and then didn't touch the steering wheel again for 80 seconds before slamming into a firetruck stopped at a red light last week, a summary of data from the car released Wednesday showed. The National Highway Traffic Safety Administration has sent its special crash investigations team to the state, the agency said as details about the Friday evening crash became public Wednesday. According to South Jordan police's summary of technician findings, the 28-year-old driver had repeatedly enabled and disabled the Autopilot features of her Tesla Model S throughout the course of her drive. She took her hands off the wheel more than a dozen times, twice for more than a minute each. The driver re-enabled Autopilot 1 minute and 22 seconds before the crash, let go of the wheel 2 seconds later and then didn't touch the wheel again before hitting the truck at 60 mph (97 kph).
The National Transportation Safety Board is investigating a crash and fire involving a Tesla Model S car. Two teens died in a Fort Lauderdale, Florida, crash on Tuesday. The probe is not expected to involve Tesla's semi-autonomous Autopilot system. You've heard of Tesla Autopilot, but perhaps not always in a good way: The semi-autonomous driving system is now under investigation by the National Transportation Safety Board for the role it may have played in a fatal accident in March near Mountain View, Calif. But you might not have heard about Cadillac Super Cruise and Nissan ProPilot Assist, two other semi-autonomous driving systems that are available in new cars today.
When Venetian merchants hauled the first shipments of a popular Ottoman drink called coffee into 17th century Europe, leaders in the Catholic Church did not exult at the prospect of increased productivity at the bottom of a warm cuppa. Instead, they asked Pope Clement VIII to declare coffee "the bitter invention of Satan." The pontiff, not one to jump to conclusions, had coffee brought before him, sipped, and made the call. "This Satan's drink is so delicious that it would be a pity to let the infidels have exclusive use of it," he declared, the (perhaps apocryphal) story goes. Which is all to say: Sometimes, people are so scared of change, they get things very wrong.
The death of a pedestrian who was struck by an autonomous vehicle in Tempe, Arizona, has brought fresh scrutiny to the accelerating development of self-driving cars. The accident on March 18 is bound to be studied exhaustively, both to determine fault and to assess and refine the overall safety of autonomous systems. According to accounts of the accident, the vehicle, outfitted to test Uber's autonomous driving system, struck a woman at night as she pushed her bicycle across a road outside of a designated crosswalk. Video of the crash, released by Tempe police, shows a woman emerging from a darkened area seconds before she was struck; in the same span of time, the safety driver looks down multiple times for reasons that aren't clear. Uber pledged its full cooperation in the unfolding investigation but has already reached a settlement with some of the victim's family members, while others have come forward, according to multiple news reports.
A Tesla sedan with a semi-autonomous Autopilot feature rear-ended a fire department truck at 60 mph (97 kph), apparently without braking before impact, on May 11, 2018; police initially said it was unknown whether the Autopilot feature was engaged. SAN FRANCISCO -- Data from the computer brain of a Tesla Model S that crashed in Utah last week confirms that the $100,000 sedan was in Autopilot mode, police in South Jordan said Wednesday. Information recovered by Tesla engineers and shared with South Jordan police confirms many of the details the driver, a 28-year-old woman from Lehi, Utah, shared with investigators after her car slammed into a stopped fire truck at 60 mph. She also said she had been distracted by her phone. Earlier Wednesday, the National Highway Traffic Safety Administration said it was sending investigators to Utah and would "take appropriate action based on its review."