Explaining Unreliable Perception in Automated Driving: A Fuzzy-based Monitoring Approach
Aniket Salvi, Gereon Weiss, Mario Trapp
Autonomous systems that rely on Machine Learning (ML) employ online fault-tolerance mechanisms, such as runtime monitors, to detect ML prediction errors and maintain safety during operation. However, the lack of human-interpretable explanations for these errors can hinder the creation of strong assurances about the system's safety and reliability. This paper introduces a novel fuzzy-based monitor tailored for ML perception components. It provides human-interpretable explanations of how different operating conditions affect the reliability of perception components, and it also functions as a runtime safety monitor. We evaluated the proposed monitor on naturalistic driving datasets as part of an automated driving case study. We assessed the monitor's interpretability and identified a set of operating conditions in which the perception component performs reliably. Additionally, we constructed an assurance case that links unit-level evidence of "correct" ML operation to system-level "safety". On the evaluated dataset, benchmarking showed that our monitor achieved a greater increase in safety (i.e., absence of hazardous situations) while maintaining availability (i.e., the ability to perform the mission) compared to state-of-the-art runtime ML monitors.
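The abstract describes a monitor that maps operating conditions to a perception-reliability estimate via interpretable fuzzy rules. The following is a minimal, self-contained sketch of that idea; the condition variables, membership functions, rule base, and threshold are illustrative assumptions, not the paper's actual design.

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def monitor(rain, illumination):
    """Map operating conditions (each in [0, 1]) to a reliability score
    using a small Mamdani-style rule base (hypothetical rules)."""
    # Fuzzify the inputs into "low" / "high" membership degrees.
    rain_low = tri(rain, -0.5, 0.0, 0.6)
    rain_high = tri(rain, 0.4, 1.0, 1.5)
    light_low = tri(illumination, -0.5, 0.0, 0.6)
    light_high = tri(illumination, 0.4, 1.0, 1.5)

    # Each rule: (firing strength, reliability consequent).
    rules = [
        (min(rain_low, light_high), 0.9),   # clear and bright -> reliable
        (min(rain_high, light_low), 0.1),   # heavy rain, dark -> unreliable
        (min(rain_high, light_high), 0.4),
        (min(rain_low, light_low), 0.5),
    ]
    # Defuzzify via weighted average of consequents.
    num = sum(w * c for w, c in rules)
    den = sum(w for w, _ in rules)
    return num / den if den > 0 else 0.0

def is_trustworthy(rain, illumination, threshold=0.5):
    """Runtime safety decision: distrust the perception output when the
    inferred reliability falls below the threshold."""
    return monitor(rain, illumination) >= threshold
```

The human-readable rule base is what makes such a monitor interpretable: each rule directly states how a combination of operating conditions is expected to affect perception reliability.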
Formalization of Operational Domain and Operational Design Domain for Automated Vehicles
Specifying an Operational Design Domain (ODD) is crucial for safeguarding automated vehicle systems against conditions that exceed their capabilities. Yet prior definitions of the ODD have relied on ambiguous and unclear terms, resulting in numerous misunderstandings and misconceptions. Moreover, the absence of essential terms, such as the Operational Domain (OD), has led to a proliferation of additional terms that further complicate and confuse the picture. Such ambiguity is unacceptable when developing safety-critical systems, where any uncertainty can lead to significant risks. This paper addresses these deficiencies by introducing a formal approach that clearly defines the OD and ODD for automated vehicles, providing a precise mathematical model of the OD and clarifying its relationship with other terms. By formalizing these terms, this work also establishes a foundation for further concepts, such as ODD specification and ODD monitoring, which are explained in this paper.
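The distinction the abstract draws between OD and ODD can be sketched set-theoretically; the notation below is an illustrative reconstruction, not the paper's exact formalization.

```latex
% Space of operating-condition dimensions (weather, road type, ...):
\[
  \mathcal{C} \;=\; C_1 \times C_2 \times \cdots \times C_n
\]
% The Operational Domain is the condition tuple the vehicle actually
% encounters at time $t$; the ODD is the designed-for subset:
\[
  \mathrm{OD}(t) \in \mathcal{C}, \qquad \mathrm{ODD} \subseteq \mathcal{C}
\]
% ODD monitoring then amounts to checking the runtime invariant
\[
  \forall t:\; \mathrm{OD}(t) \in \mathrm{ODD},
\]
% and triggering a fallback (e.g., a minimal-risk manoeuvre) on violation.
```

Under this reading, ODD specification is the task of describing the subset $\mathrm{ODD}$ precisely, and ODD monitoring is the task of deciding membership of $\mathrm{OD}(t)$ in it at runtime.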