comma ai | Human-Machine Cooperation in openpilot | Weixing Zhang | COMMA_CON talks | Research | Summary and Q&A

TL;DR
This talk discusses how openpilot uses machine learning to improve the interaction between the driver and the driving model, covering driver monitoring, improved ground truth, adaptive driving policies, and feedback mechanisms.
Key Insights
- 🪛 The switch to the comma three's driver-facing camera has significantly improved driver monitoring capabilities, including better face detection and the ability to detect driving patterns.
- ❤️🩹 End-to-end driver monitoring allows the model to directly predict whether the driver is distracted or paying attention, leading to better detection accuracy.
- 🥺 Analyzing driving behavior with metrics like harsh acceleration and braking can lead to improvements in the driving model and a smoother driving experience.
- 👤 openpilot users tend to be less distracted and pay more attention to the road when the driving system is engaged.
- 👤 Long-term metrics show that openpilot users do not become more distracted over time, countering the belief that users grow complacent with the system.
- 👻 Automatic disengagement reports help identify areas where the driving model may need improvement, allowing for targeted fixes and updates.
Questions & Answers
Q: How does openpilot handle driver monitoring when the driver's eyes are not visible, for example when wearing sunglasses?
When the driver's eyes are not visible, openpilot falls back on head pose detection. If the model detects unusual poses or behaviors, it can still identify when the driver is distracted or not paying attention.
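The fallback described above can be sketched as a simple two-branch check. This is an illustrative sketch only: the `MonitoringFrame` fields and all thresholds are invented for clarity and are not openpilot's actual model outputs or values.

```python
from dataclasses import dataclass

@dataclass
class MonitoringFrame:
    """Hypothetical per-frame driver-monitoring outputs."""
    eyes_visible: bool
    gaze_on_road_prob: float  # confidence that the gaze is on the road
    head_yaw: float           # radians, 0 = facing forward
    head_pitch: float         # radians, 0 = level

# Illustrative thresholds, not openpilot's actual values.
GAZE_THRESHOLD = 0.5
HEAD_YAW_LIMIT = 0.4
HEAD_PITCH_LIMIT = 0.3

def is_distracted(frame: MonitoringFrame) -> bool:
    if frame.eyes_visible:
        # Primary signal: gaze direction from the eyes.
        return frame.gaze_on_road_prob < GAZE_THRESHOLD
    # Fallback when eyes are occluded (e.g. sunglasses):
    # flag unusual head poses instead.
    return (abs(frame.head_yaw) > HEAD_YAW_LIMIT
            or abs(frame.head_pitch) > HEAD_PITCH_LIMIT)
```

The key design point is that the system degrades gracefully: losing the eye signal narrows, but does not disable, distraction detection.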
Q: Will the investment in driver monitoring decrease as we move towards Level 4 and Level 5 autonomy?
openpilot will continue to require driver attention, but there may be cases where the model can adapt and alert the driver only when necessary. The goal is to find a balance between driver monitoring and the capabilities of the driving model.
Q: How are drivers classified as "chill" or not, and are there any specific data breakdowns for different driving styles?
Drivers are classified into five categories based on their driving style, including metrics like harsh acceleration, hard braking, and hard turning. While specific data breakdowns may not be tracked for individual drivers, the driving model can adapt to individual driving styles.
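One simple way to realize the five-way classification described above is to bucket drivers by their rate of harsh events. The category labels and thresholds below are invented for illustration and are not the ones used in the talk.

```python
def classify_driving_style(harsh_events_per_100km: float) -> str:
    """Bucket a driver into one of five style categories by the combined
    rate of harsh acceleration, braking, and turning events.
    All labels and cutoffs are hypothetical."""
    buckets = [
        (1.0, "chill"),
        (3.0, "relaxed"),
        (6.0, "average"),
        (10.0, "spirited"),
    ]
    for limit, label in buckets:
        if harsh_events_per_100km < limit:
            return label
    return "aggressive"
```

A per-driver classification like this is what would let an adaptive policy tune its acceleration and braking profile to match the driver's style.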
Q: Can the driving model communicate back to the driver about its uncertainty or areas where it does not want to go?
There are plans to show uncertainty in the user interface by using a confidence indicator. For example, if the model detects potential dangers or uncertain situations, it can alert the driver with a dynamic indicator to draw their attention.
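A confidence indicator of this kind can be sketched as a mapping from model confidence to a UI state, with a more dynamic state at low confidence to draw the driver's attention. The states and cutoffs here are assumptions for illustration, not openpilot's actual UI logic.

```python
def confidence_indicator(model_confidence: float) -> str:
    """Map the driving model's confidence to a UI indicator state.
    Levels and cutoffs are illustrative assumptions."""
    if model_confidence >= 0.8:
        return "green"         # steady, unobtrusive
    if model_confidence >= 0.5:
        return "yellow"        # mildly draws attention
    return "red_flashing"      # dynamic indicator: demand attention
```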
Summary & Key Takeaways
- openpilot is a driver-assistance system that requires the user to be ready to take over at all times; this talk focuses on the progress made in driver monitoring since the last update.
- The introduction of the comma three camera has enabled better driver monitoring, with a near-100% face detection rate and the ability to detect whether the driver is seated on the left or the right side of the car.
- The talk also discusses the shift towards an end-to-end approach for driver monitoring, where the model learns directly from the driver's behavior, improving true-positive detection.