Criminals are using Zillow to plan break-ins. Here's how to remove your home in 10 minutes.

FOX News



The final 11 seconds of a fatal Tesla Autopilot crash

Washington Post - Technology News

The sun had yet to rise in Delray Beach, Fla., when Jeremy Banner flicked on Autopilot. His red Tesla Model 3 sped down the highway at nearly 70 mph, his hands no longer detected on the wheel. Seconds later, the Tesla plowed into a semi-truck, shearing off its roof as it slid under the truck's trailer. Banner was killed on impact. Banner's family sued after the gruesome 2019 collision, one of at least 10 active lawsuits involving Tesla's Autopilot, several of which are expected to go to court over the next year. Together, the cases could determine whether the driver is solely responsible when things go wrong in a vehicle guided by Autopilot -- or whether the software should also bear some of the blame.


Teslas with Autopilot Move Closer to Being Recalled

TIME - Tech

Teslas with partially automated driving systems are a step closer to being recalled after the U.S. elevated its investigation into a series of collisions with parked emergency vehicles or trucks with warning signs. The National Highway Traffic Safety Administration said Thursday that it is upgrading the Tesla probe to an engineering analysis, another sign of increased scrutiny of the electric vehicle maker and automated systems that perform at least some driving tasks. An engineering analysis is the final stage of an investigation, and in most cases NHTSA decides within a year if there should be a recall or the probe should be closed. Documents posted Thursday by the agency raise some serious issues about Tesla's Autopilot system. The agency found that it's being used in areas where its capabilities are limited, and that many drivers aren't taking action to avoid crashes despite warnings from the vehicle.


Explainable Fact-checking through Question Answering

Yang, Jing, Vega-Oliveros, Didier, Seibt, Taís, Rocha, Anderson

arXiv.org Artificial Intelligence

Misleading or false information has been creating chaos in some places around the world. To mitigate this issue, many researchers have proposed automated fact-checking methods to fight the spread of fake news. However, most methods cannot explain the reasoning behind their decisions, failing to build trust between humans and the machines that use such technology. Trust is essential for fact-checking to be applied in the real world. Here, we address fact-checking explainability through question answering. In particular, we propose generating questions and answers from claims and answering the same questions from evidence. We also propose an answer comparison model with an attention mechanism attached to each question. Leveraging question answering as a proxy, we break down automated fact-checking into several steps -- this separation aids models' explainability, as it allows for more detailed analysis of their decision-making processes. Experimental results show that the proposed model can achieve state-of-the-art performance while providing reasonable explanations of its decisions.
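The abstract's pipeline (generate questions from a claim, answer them from both the claim and the evidence, then compare the answers) can be sketched in a few lines. This is an illustrative stub, not the authors' model: the question-generation and QA models are replaced by precomputed answer dictionaries, and the attention-based comparison is stood in for by simple token overlap.

```python
# Sketch of fact-checking via question answering. Answer generation is
# assumed done upstream; the attention-based answer comparison model is
# replaced here by token-overlap similarity. All names are illustrative.

def token_overlap(a: str, b: str) -> float:
    """Jaccard similarity between two answers' token sets."""
    ta, tb = set(a.lower().split()), set(b.lower().split())
    return len(ta & tb) / len(ta | tb) if ta | tb else 0.0

def verdict(claim_answers: dict, evidence_answers: dict,
            threshold: float = 0.5) -> str:
    """Compare answers to the same questions from claim vs. evidence.

    High average agreement -> the evidence supports the claim. The
    per-question scores also serve as the explanation: they show which
    questions the evidence answered differently from the claim.
    """
    scores = [token_overlap(claim_answers[q], evidence_answers.get(q, ""))
              for q in claim_answers]
    return "supported" if sum(scores) / len(scores) >= threshold else "refuted"

# Toy example with one question derived from a claim.
claim_qa = {"Who founded the company?": "Jane Doe"}
evidence_qa = {"Who founded the company?": "the founder Jane Doe"}
print(verdict(claim_qa, evidence_qa))  # supported
```

The per-question scores are what make the decision inspectable: a single mismatched answer points directly at the part of the claim the evidence contradicts.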


US probing Autopilot problems on 765,000 Tesla vehicles

Boston Herald

The U.S. government has opened a formal investigation into Tesla's Autopilot partially automated driving system after a series of collisions with parked emergency vehicles. The investigation covers 765,000 vehicles, almost everything that Tesla has sold in the U.S. since the start of the 2014 model year. In the crashes identified by the National Highway Traffic Safety Administration as part of the probe, 17 people were injured and one was killed. NHTSA says it has identified 11 crashes since 2018 in which Teslas on Autopilot or Traffic Aware Cruise Control have hit vehicles at scenes where first responders have used flashing lights, flares, an illuminated arrow board or cones warning of hazards. The agency announced the action Monday in a posting on its website.


Rare-Event Simulation for Neural Network and Random Forest Predictors

Bai, Yuanlu, Huang, Zhiyuan, Lam, Henry, Zhao, Ding

arXiv.org Machine Learning

We study rare-event simulation for a class of problems where the target hitting sets of interest are defined via modern machine learning tools such as neural networks and random forests. This problem is motivated from fast emerging studies on the safety evaluation of intelligent systems, robustness quantification of learning models, and other potential applications to large-scale simulation in which machine learning tools can be used to approximate complex rare-event set boundaries. We investigate an importance sampling scheme that integrates the dominating point machinery in large deviations and sequential mixed integer programming to locate the underlying dominating points. Our approach works for a range of neural network architectures including fully connected layers, rectified linear units, normalization, pooling and convolutional layers, and random forests built from standard decision trees. We provide efficiency guarantees and numerical demonstration of our approach using a classification model in the UCI Machine Learning Repository.


Canadian police charged a Tesla owner for sleeping while driving

Engadget

Police in Canada say they recently charged a Tesla Model S owner with dangerous driving for sleeping at his car's wheel. In July, the Royal Canadian Mounted Police (RCMP) say they responded to a speeding complaint on Highway 2 near Ponoka -- a town in Alberta, south of the province's capital of Edmonton. Witnesses reported the car was traveling faster than 140 kilometers per hour (86MPH), with the front seats "completely reclined" and both the driver and passenger seemingly asleep. When a police officer found the 2019 Model S and turned on their emergency lights, the vehicle accelerated to 150 kilometers per hour (about 93MPH) before it eventually stopped. Police initially charged the driver, a 20-year-old man from the province of British Columbia, with speeding and handed him a 24-hour license suspension for driving while fatigued. He was later also charged with dangerous driving and has a court date in December.


Your Tesla could explain why it crashed. But good luck getting its Autopilot data

#artificialintelligence

On Jan. 21, 2019, Michael Casuga drove his new Tesla Model 3 southbound on Santiago Canyon Road, a two-lane highway that twists through hilly woodlands east of Santa Ana. He wasn't alone, in one sense: Tesla's semiautonomous driver-assist system, known as Autopilot -- which can steer, brake and change lanes -- was activated. Suddenly and without warning, Casuga claims in a Superior Court of California lawsuit, Autopilot yanked the car left. The Tesla crossed a double yellow line, and without braking, drove through the oncoming lane and crashed into a ditch, all before Casuga was able to retake control. Tesla confirmed Autopilot was engaged, according to the suit, but said the driver was to blame, not the technology.


A super-fast machine learning model for finding user search intent

#artificialintelligence

In April 2019, Benjamin Burkholder (who is awesome, by the way) published a Medium article showing off a script he wrote that uses SERP result features to infer a user's search intent. The script relies on the SerpAPI.com service. This is one of the coolest ways to estimate search intent, because it leverages Google's own understanding of intent (as expressed by the SERP features shown for that search). The one problem with Burkholder's approach is its reliance on the SerpAPI: if you have a large set of search queries you want to find intent for, you need to pass each query phrase through the API, which actually performs the search and returns the SERP feature results, which Burkholder's script can then classify.
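The core of the approach (map which SERP features appear for a query to a likely intent) is simple to sketch. The feature names and rules below are illustrative stand-ins, not the actual SerpAPI response keys or Burkholder's exact mapping:

```python
# Sketch of SERP-feature-based intent classification. The feature
# names and the feature -> intent rules are hypothetical examples;
# a real implementation would use the keys returned by the SERP API.

INTENT_RULES = [
    ("shopping_results", "transactional"),   # product carousels suggest buying
    ("local_results",    "local"),           # map packs suggest "near me" intent
    ("knowledge_graph",  "navigational"),    # entity panels suggest a known target
    ("featured_snippet", "informational"),   # direct answers suggest a question
]

def classify_intent(serp_features: set) -> str:
    """Return the first matching intent for the features present."""
    for feature, intent in INTENT_RULES:
        if feature in serp_features:
            return intent
    return "informational"  # default when no strong signal appears

print(classify_intent({"shopping_results", "ads"}))  # transactional
```

The rule order encodes priority: a query whose SERP shows both shopping results and a featured snippet is treated as transactional, since commercial features are usually the stronger signal. A faster model, as the article's title suggests, would learn this mapping once and then classify queries without calling the API per phrase.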