Been busy so I haven't posted much in this thread. I had a bunch of tabs open for months now that I wanted to post here:
https://www.theverge.com/2020/10/30...driving-car-data-miles-crashes-phoenix-google
This Verge article was published around the time Tesla updated their Autopilot beta program last year. At the time, a lot of people were very unhappy that they were publicly beta testing the algorithm on public roads.
(language warning for this Twitter username)
The article details two papers Waymo published at the time. The first discusses safety and the second reports data from their testing in Phoenix: "this is the first time that Waymo has ever publicly disclosed mileage and crash data from its autonomous vehicle testing operation in Phoenix."
Firstly, they use three layers for safety:
- Hardware, including the vehicle itself, the sensor suite, the steering and braking system, and the computing platform;
- The automated driving system behavioral layer, such as avoiding collisions with other cars, successfully completing fully autonomous rides, and adhering to the rules of the road;
- Operations, like fleet operations, risk management, and a field safety program to resolve potential safety issues.
Secondly, Waymo says that between January and December 2019 their vehicles drove 6.1 million miles, and that from January 2019 to September 2020 their fully driverless vehicles drove 65,000 miles. During this period, they had 47 "contact events," 29 of which occurred only in simulation. None resulted in severe injuries, and most were the fault of a human driver or pedestrian.
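Taking those figures at face value, you can ballpark a rate of real-world contact events per million miles. This is just back-of-the-envelope arithmetic on the numbers quoted above, not an official Waymo metric (and it lumps the driverless miles in with the rest):

```python
# Back-of-the-envelope rate from the figures quoted above (not Waymo's own metric).
total_miles = 6_100_000 + 65_000   # autonomous miles + fully driverless miles
total_events = 47                  # reported "contact events"
simulated_events = 29              # events that occurred only in simulation
actual_events = total_events - simulated_events

rate_per_million = actual_events / (total_miles / 1_000_000)
print(actual_events)                 # → 18
print(round(rate_per_million, 2))    # → 2.92
```

So roughly 3 real-world contact events per million miles, under those assumptions.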
In this article, the CEO and co-founder of Voyage describes how their system makes decisions.
https://news.voyage.auto/teaching-a-self-driving-a-i-to-make-human-like-decisions-a9a9597dd156
High-Quality Decision Making is fueled by two models, one optimization-based (i.e., reliable) and one machine-learned (i.e., intelligent), with each serving different responsibilities. The optimization-based model is responsible for ensuring our vehicle always adheres to the rules of the road (e.g., preventing the running of stop-lines, or getting too close to pedestrians), while the machine-learned model—trained on rich, historical driving data—is responsible for tapping into its vast history of experience to select the most human-like decision to make from a refined list of safe options.
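The quoted scheme can be sketched as a two-stage pipeline: a rule-based layer filters candidate maneuvers down to a safe, road-legal set, and a learned model then picks the most human-like option from that set. This is a toy illustration of that idea only; the class and function names (`Decision`, `is_rule_compliant`, `human_likeness`) and the scoring heuristic are my own stand-ins, not Voyage's actual implementation:

```python
from dataclasses import dataclass

@dataclass
class Decision:
    action: str       # e.g. "proceed", "yield", "stop"
    speed_mps: float

def is_rule_compliant(d: Decision) -> bool:
    # Stand-in for the optimization-based layer: hard rules-of-the-road
    # constraints (toy version: never run a stop line, cap speed at 15 m/s).
    return d.action != "run_stop_line" and d.speed_mps <= 15.0

def human_likeness(d: Decision) -> float:
    # Stand-in for the machine-learned model trained on historical driving
    # data (toy heuristic: prefer proceeding smoothly at moderate speed).
    preference = {"proceed": 1.0, "yield": 0.6, "stop": 0.3}
    return preference.get(d.action, 0.0) - abs(d.speed_mps - 10.0) / 100.0

def choose(candidates):
    # Filter to the "refined list of safe options", then let the learned
    # scorer pick the most human-like one.
    safe = [d for d in candidates if is_rule_compliant(d)]
    return max(safe, key=human_likeness) if safe else None

candidates = [
    Decision("run_stop_line", 12.0),  # rejected by the rule layer
    Decision("proceed", 11.0),
    Decision("yield", 8.0),
]
best = choose(candidates)
print(best.action)  # → proceed
```

The key design point from the quote is the division of labor: the rule layer guarantees safety regardless of what the learned model prefers, and the learned model only ever chooses among options the rule layer has already approved.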
Finally, here's a video of comma.ai's openpilot driving a truck in American Truck Simulator!