

Utilizing AI for Aviation Post-Accident Analysis Classification

Nanyonga, Aziida, Wild, Graham

arXiv.org Artificial Intelligence

The volume of textual data available in aviation safety reports presents a challenge for timely and accurate analysis. This paper examines how Artificial Intelligence (AI) and, specifically, Natural Language Processing (NLP) can automate the process of extracting valuable insights from this data, ultimately enhancing aviation safety. The paper reviews ongoing efforts focused on the application of NLP and deep learning to aviation safety reports, with the goal of classifying the level of damage to an aircraft and identifying the phase of flight during which safety occurrences happen. Additionally, the paper explores the use of Topic Modeling (TM) to uncover latent thematic structures within aviation incident reports, aiming to identify recurring patterns and potential areas for safety improvement. The paper compares and contrasts the performance of various deep learning models and TM techniques applied to datasets from the National Transportation Safety Board (NTSB) and the Australian Transport Safety Bureau (ATSB), as well as the Aviation Safety Network (ASN), discussing the impact of dataset size and source on the accuracy of the analysis. The findings demonstrate that both NLP and deep learning, as well as TM, can significantly improve the efficiency and accuracy of aviation safety analysis, paving the way for more proactive safety management and risk mitigation strategies.
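The damage-level classification task the abstract describes can be illustrated with a minimal sketch. The example below is not the authors' method; it uses synthetic, hypothetical narratives (not real NTSB records) and a simple bag-of-words multinomial naive Bayes classifier to show the general shape of such a pipeline, under the assumption that each report narrative is labelled with a damage category.

```python
# Minimal sketch of narrative-text classification for aviation reports.
# Assumptions: synthetic example narratives and labels, not real NTSB data;
# a hand-rolled naive Bayes classifier stands in for the deep learning
# models compared in the paper.
import math
import re
from collections import Counter, defaultdict

def tokenize(text):
    """Lowercase a narrative and split it into word tokens."""
    return re.findall(r"[a-z']+", text.lower())

class NaiveBayesClassifier:
    def __init__(self):
        self.word_counts = defaultdict(Counter)  # label -> word frequencies
        self.label_counts = Counter()            # label -> document count
        self.vocab = set()

    def fit(self, narratives, labels):
        for text, label in zip(narratives, labels):
            tokens = tokenize(text)
            self.word_counts[label].update(tokens)
            self.label_counts[label] += 1
            self.vocab.update(tokens)

    def predict(self, text):
        tokens = tokenize(text)
        total_docs = sum(self.label_counts.values())
        best_label, best_score = None, float("-inf")
        for label in self.label_counts:
            # log prior + log likelihoods with Laplace (add-one) smoothing
            score = math.log(self.label_counts[label] / total_docs)
            denom = sum(self.word_counts[label].values()) + len(self.vocab)
            for tok in tokens:
                score += math.log((self.word_counts[label][tok] + 1) / denom)
            if score > best_score:
                best_label, best_score = label, score
        return best_label

# Hypothetical training narratives, labelled by aircraft damage level.
reports = [
    ("aircraft veered off runway during landing, wingtip scraped", "substantial"),
    ("hard landing collapsed the nose gear on touchdown", "substantial"),
    ("aircraft impacted terrain and was consumed by post-crash fire", "destroyed"),
    ("in-flight breakup, wreckage scattered over a wide area", "destroyed"),
]
clf = NaiveBayesClassifier()
clf.fit([r for r, _ in reports], [l for _, l in reports])
print(clf.predict("post-crash fire consumed the aircraft after impact"))  # → destroyed
```

In practice the paper's pipelines would replace this toy classifier with learned embeddings and deep models, but the overall structure (tokenize narratives, fit on labelled reports, predict a category for a new narrative) is the same.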


The final 11 seconds of a fatal Tesla Autopilot crash

Washington Post - Technology News

The sun had yet to rise in Delray Beach, Fla., when Jeremy Banner flicked on Autopilot. His red Tesla Model 3 sped down the highway at nearly 70 mph, his hands no longer detected on the wheel. Seconds later, the Tesla plowed into a semi-truck, shearing off its roof as it slid under the truck's trailer. Banner was killed on impact. Banner's family sued after the gruesome 2019 collision, one of at least 10 active lawsuits involving Tesla's Autopilot, several of which are expected to go to court over the next year. Together, the cases could determine whether the driver is solely responsible when things go wrong in a vehicle guided by Autopilot -- or whether the software should also bear some of the blame.


Former MS state senator's plane had autopilot issues in leadup to near-vertical fatal crash

FOX News

A small plane had mechanical problems with its autopilot system before it crashed in Arkansas last month and killed a former Mississippi state senator who was flying it, according to a preliminary report by the National Transportation Safety Board. Johnny Morgan, 76, of Oxford, Mississippi, served in the Mississippi Senate from 1984 to 1992. He was the only person aboard the twin-engine Beech King Air E-90 plane when it crashed May 17 in a wooded area in northwestern Arkansas, south of Fayetteville.


Elon Musk will likely face deposition in lawsuit over deadly Tesla Autopilot crash

Engadget

Elon Musk may have to answer detailed questions regarding a fatal 2018 Tesla crash where Autopilot was involved. Judge Evette Pennypacker has ordered Musk to give a three-hour deposition in a lawsuit over the crash, which killed Apple engineer Walter Huang when his Model X plowed into a highway median south of San Francisco. Attorneys for Huang's family want to grill the tech CEO over statements he made about Autopilot's capabilities in the years before the incident. Most notably, the plaintiffs point to a 2016 Code Conference interview (shown below) where Musk maintained that Tesla cars with Autopilot could already drive with "greater safety than a person." They're also concerned about a 2016 self-driving demo video that engineers testified was staged to show features that weren't ready.


Tesla investigation deepens after more than a dozen US 'Autopilot' crashes

The Guardian

US federal regulators are deepening their investigation into Tesla's Autopilot function after more than a dozen Tesla cars crashed into parked first-responder vehicles over a period of four years. The National Highway Traffic Safety Administration (NHTSA) said on Thursday it was upgrading its preliminary investigation, which launched last August, to an "engineering analysis", a step required before the agency can determine whether to order a recall. The investigation covers all four Tesla vehicles – Models Y, X, S and 3 – representing about 830,000 vehicles that have been sold in the US. The investigation is focused on Tesla's Autopilot feature, which is supposed to help drivers navigate roads by using artificial intelligence to detect other vehicles. The company instructs drivers to pay attention to the road and keep their hands on the steering wheel while using Autopilot, though some drivers have used Autopilot while drunk or while sitting in the back seat of the car.


Tesla autopilot stirs U.S. alarm as 'disaster waiting to happen'

The Japan Times

Derrick Monet and his wife, Jenna, were driving on an Indiana interstate in 2019 when their Tesla Model 3 sedan operating on Autopilot crashed into a parked fire truck. Derrick, then 25, sustained spine, neck, shoulder, rib and leg fractures. Jenna, 23, died at the hospital. The incident was one of a dozen in the last four years in which Teslas using this driver-assistance system collided with first-responder vehicles, raising questions about the safety of technology the world's most valuable car company considers one of its crown jewels. Now, U.S. regulators are applying greater scrutiny to Autopilot than ever before.


US probing Autopilot problems on 765,000 Tesla vehicles

Boston Herald

The U.S. government has opened a formal investigation into Tesla's Autopilot partially automated driving system after a series of collisions with parked emergency vehicles. The investigation covers 765,000 vehicles, almost everything that Tesla has sold in the U.S. since the start of the 2014 model year. Of the crashes identified by the National Highway Traffic Safety Administration as part of the probe, 17 people were injured and one was killed. NHTSA says it has identified 11 crashes since 2018 in which Teslas on Autopilot or Traffic Aware Cruise Control have hit vehicles at scenes where first responders have used flashing lights, flares, an illuminated arrow board or cones warning of hazards. The agency announced the action Monday in a posting on its website.


Automakers must report crashes involving self-driving and driver-assist systems

Engadget

The National Highway Traffic Safety Administration (NHTSA) has implemented a new policy that will require car companies to report incidents involving semi- and fully autonomous driving systems within one day of learning of an accident. In an order spotted by The Washington Post, NHTSA mandates that automakers fill out an electronic incident form and submit it to the agency when one of their systems was active either during a crash or immediately before it. They must report an accident anytime there is a death, an injury that requires hospital treatment, a vehicle that is towed away, an airbag deployment, or a pedestrian or cyclist involved. The order covers everything from Level 2 advanced driver-assistance systems to Level 5 fully autonomous vehicles, meaning it includes the full gamut from Tesla cars with Autopilot to Waymo taxis. "This action will enable NHTSA to collect information necessary for the agency to play its role in keeping Americans safe on the roadways, even as the technology deployed on the nation's roads continues to evolve," the regulator said.


Home video shows driver entering front door before deadly Tesla crash, NTSB says

USATODAY - Tech Top Stories

Federal investigators said Monday they were able to glean some insights into what might have happened after a fire erupted from a Tesla crash that killed two people in the Houston area in April and destroyed the vehicle's data recorder. The National Transportation Safety Board released preliminary findings from its probe into the crash, which raised speculation about whether the vehicle's partially self-driving system, Autopilot, was to blame. The speculation stemmed from local authorities saying they were nearly positive that no one was behind the wheel when the vehicle crashed. The NTSB, in its preliminary report, said video footage from the vehicle owner's home security system showed him getting behind the wheel of the Tesla Model S and then slowly exiting the driveway. The vehicle traveled about 550 feet "before departing the road on a curve, driving over the curb, and hitting a drainage culvert, a raised manhole and a tree," according to the NTSB.