No Validated Framework for Detecting Autonomous Vehicle Edge Cases
Automated vehicles fail unpredictably in rare "edge cases" — scenarios where perception, prediction, or planning systems encounter inputs outside their training distribution. The first systematic review to map edge case detection across all AV subsystems found that no comprehensive detection framework exists. Previous studies addressed perception anomalies (camera/LiDAR failures) or trajectory anomalies (path planning errors) in isolation, never both simultaneously. Knowledge-driven detection approaches using expert domain rules are "largely overlooked." There is not even an agreed-upon method for estimating how frequently edge cases occur in real driving, making it impossible to quantify residual risk for regulatory approval.
The fundamental barrier to AV deployment at scale is not average-case performance — most AV systems drive safely 99%+ of the time — but the inability to characterize and detect the remaining fraction where catastrophic failures occur. Regulators (NHTSA, UNECE) require demonstration of safety equivalence to human drivers, but without a method to enumerate or detect edge cases, this comparison cannot be made rigorously. The Waymo and Cruise incidents that paused AV deployment in 2023–2024 were edge cases that existing detection systems did not flag.
Reconstructive methods (autoencoders) assume anomalies cause higher reconstruction errors — but this assumption is "unproven" in safety-critical contexts and produces both false positives and missed detections. Predefined threshold approaches fail to capture the "nuanced dynamics of every conceivable driving scenario." ML-based anomaly detectors "lack transparency and interpretability," are "prone to overfitting," and fail to "generalize well to new, unseen scenarios." Simulation-based edge case discovery does not reliably transfer to real-world occurrence patterns (sim-to-real gap). Most methods have been applied only to 2D camera images and have not been extended to multimodal sensor fusion (camera + LiDAR + radar), which is how production AV systems actually perceive their environment.
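To make the reconstruction-error assumption concrete, here is a minimal sketch of how such a detector flags inputs. Everything here is illustrative: `toy_reconstruct` stands in for a trained autoencoder, and the fixed threshold is the very design choice the review criticizes, since it directly trades false positives against missed detections.

```python
from typing import Callable, List

def reconstruction_anomaly_flags(
    samples: List[List[float]],
    reconstruct: Callable[[List[float]], List[float]],
    threshold: float,
) -> List[bool]:
    """Flag samples whose reconstruction error exceeds a fixed threshold.

    Core (unproven) assumption: out-of-distribution inputs reconstruct
    poorly, so high mean-squared error marks an anomaly.
    """
    flags = []
    for x in samples:
        x_hat = reconstruct(x)
        mse = sum((a - b) ** 2 for a, b in zip(x, x_hat)) / len(x)
        flags.append(mse > threshold)
    return flags

# Toy stand-in for a trained autoencoder: a lossy round-trip that damps
# each feature toward 0.5, so inputs far from the "training centre"
# reconstruct badly while typical inputs pass through almost unchanged.
def toy_reconstruct(x: List[float]) -> List[float]:
    return [0.5 + 0.9 * (v - 0.5) for v in x]
```

Note that the threshold is global: a value strict enough to catch the far-from-distribution sample will also flag benign but unusual inputs, which is exactly the false-positive/missed-detection tension described above.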
Cross-subsystem detection frameworks that monitor perception, prediction, and planning simultaneously could catch edge cases that single-subsystem monitors miss. Knowledge-driven methods — encoding traffic rules, physics constraints, and common-sense priors — could complement data-driven anomaly detection, catching "known unknowns" rather than waiting for statistical anomalies. Few-shot learning approaches could enable detection of rare edge cases from minimal examples. Federated learning across AV fleets could build diverse edge case datasets without sharing proprietary data.
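A knowledge-driven check of the kind described above can be sketched as hand-coded rules over a trajectory. This is a hypothetical illustration, not the review's method: the rule set, default speed limit (~50 km/h), and braking bound are assumptions chosen for the example.

```python
from typing import List, Tuple

def knowledge_rule_violations(
    speeds: List[Tuple[float, float]],   # (timestamp_s, speed_mps) samples
    speed_limit_mps: float = 13.9,       # ~50 km/h urban limit (assumed)
    max_decel_mps2: float = 8.0,         # rough physical braking limit (assumed)
) -> List[str]:
    """Flag trajectory samples that break hand-coded traffic/physics rules.

    A knowledge-driven complement to statistical anomaly detection: the
    rules encode "known unknowns" (speeding, physically implausible
    braking) rather than waiting for a learned model to score them.
    """
    violations = []
    for i, (t, v) in enumerate(speeds):
        if v > speed_limit_mps:
            violations.append(f"speeding@{t:.1f}s")
        if i > 0:
            t_prev, v_prev = speeds[i - 1]
            accel = (v - v_prev) / (t - t_prev)
            if accel < -max_decel_mps2:
                violations.append(f"implausible_braking@{t:.1f}s")
    return violations
```

For example, a trajectory that drops from 10 m/s to 4 m/s in half a second implies 12 m/s² of deceleration, beyond the assumed physical limit, so the rule flags it even if a data-driven detector has never seen a similar trace.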
A team could build a cross-subsystem anomaly detector using publicly available AV datasets (nuScenes, Waymo Open Dataset, KITTI), testing whether joint perception-plus-planning monitoring catches edge cases that single-subsystem monitors miss. A policy-focused team could develop a taxonomy of AV edge case types and propose a quantitative safety case framework. Relevant disciplines: computer science, robotics, safety engineering, transportation policy.
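The joint-monitoring idea above hinges on fusing anomaly scores from more than one subsystem. A minimal sketch of the fusion logic follows; all names and thresholds are hypothetical, and in a real system the two scores would come from, e.g., a perception autoencoder and a planner plausibility check.

```python
def cross_subsystem_alert(
    perception_score: float,
    planning_score: float,
    perception_thresh: float,
    planning_thresh: float,
    joint_thresh: float,
) -> bool:
    """Alert when either subsystem's anomaly score crosses its own
    threshold, or when the threshold-normalized scores are jointly high
    even though each is individually sub-threshold -- the case a
    single-subsystem monitor would miss."""
    if perception_score > perception_thresh or planning_score > planning_thresh:
        return True
    joint = perception_score / perception_thresh + planning_score / planning_thresh
    return joint > joint_thresh
```

The design choice worth noting is the second branch: two mildly anomalous subsystems (say, 80% and 90% of their individual thresholds) trigger a joint alert at `joint_thresh = 1.5`, which is precisely the kind of edge case that per-subsystem monitors, each seeing a sub-threshold score, would let through.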
First comprehensive systematic review of AV edge case detection covering both perception and trajectory subsystems. Six unresolved research directions identified: sim-to-real gap, few-shot learning, federated learning, explainability, exposure estimation, and collaborative evaluation. Related briefs: transportation-automated-driving-monitoring (focuses on driver monitoring in L2/L3 vehicles, not perception edge cases), digital-safe-rl-exploration-guarantees (related formal safety challenge). The regulatory constraint applies because no regulatory framework exists for certifying AV safety when edge case frequency is unknown.
Rahmani, S. et al., "A Systematic Review of Edge Case Detection in Automated Driving: Methods, Challenges and Future Directions," arXiv:2410.08491, submitted to IEEE Transactions on Intelligent Transportation Systems, 2024, https://arxiv.org/abs/2410.08491; accessed 2026-02-20