Author Topic: Tesla crashes and politics

LetterRip

Tesla crashes and politics
« on: August 16, 2021, 02:06:56 PM »
So,

Quote
US government regulators are opening an investigation into Tesla’s Autopilot system after cars using the feature crashed into stopped emergency vehicles.

The National Highway Transportation Safety Administration announced the investigation today, and it encompasses 765,000 Teslas sold in the US, a significant fraction of all of the company’s sales in the country. The agency says the probe will cover 11 crashes since 2018; the crashes caused 17 injuries and one death.

https://arstechnica.com/cars/2021/08/us-investigates-autopilot-after-11-teslas-crashed-into-emergency-vehicles

So one of the Ars commenters ran the math:

Quote
Putting this in context in any year there are typically:

6500 crashes involving EMS/ambulance
3100 crashes involving fire trucks

It's harder to get crash numbers for police vehicles, but they have around 300 police-involved traffic fatalities every year. If we estimate based on the national dataset of all crashes, about 0.5% of crashes are fatal. That means about 60,000 police vehicle crashes every year.

So, we can estimate around 70,000 emergency vehicle crashes per year in the USA. The number of Tesla-involved crashes appears to be 11 over a ~3 year timeframe. So we're comparing against 210000 overall emergency vehicle crashes.

So, based on those numbers 0.005% of emergency vehicle crashes involved a Tesla on Autopilot.

The US has about 275,000,000 vehicles. So, take the 765k Teslas, a bit more math: Tesla represents 0.28% of vehicles on the road.

If we assume Autopilot is used half the time, we can therefore estimate that a "fleet average" vehicle is ~30x as likely to be in a crash with an emergency vehicle when compared to a Tesla on Autopilot. Probably need to further reduce that based on a weighted average of Tesla fleet share over the same timeframe, which likely would reduce it to somewhere around ~20x.

https://arstechnica.com/cars/2021/08/us-investigates-autopilot-after-11-teslas-crashed-into-emergency-vehicles/?comments=1&post=40143857#comment-40143857

So Teslas on Autopilot are roughly 20-30x less likely to be in an accident with an emergency vehicle.  You'd think the NHTSA would have run the math themselves and realized how much less likely such accidents are with a Tesla.
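If anyone wants to check the commenter's arithmetic, here's a quick back-of-envelope sketch in Python. It uses only the numbers from the quoted comment, including its assumed 50% Autopilot usage, which is an assumption rather than a measured figure:

Code:
# Back-of-envelope check of the quoted Ars comment's numbers.
ev_crashes_per_year = 70_000     # comment's estimate of US emergency-vehicle crashes/year
years = 3                        # window of the NHTSA probe
autopilot_ev_crashes = 11        # Autopilot crashes under investigation

total_ev_crashes = ev_crashes_per_year * years       # 210,000
tesla_fleet_share = 765_000 / 275_000_000            # ~0.28%
autopilot_usage = 0.5                                # the comment's assumption

# Expected Autopilot-involved crashes if Teslas crashed at the fleet-average rate:
expected = total_ev_crashes * tesla_fleet_share * autopilot_usage   # ~292
print(expected / autopilot_ev_crashes)               # ~27, i.e. the "~30x" headline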


TheDrake

Re: Tesla crashes and politics
« Reply #1 on: August 16, 2021, 03:49:00 PM »
I think you'd have to account for other factors.

As one example, Autopilot is less likely to have an accident in adverse weather, because it gets disabled in those conditions, while the human accident statistics are weighted toward exactly that kind of weather.

Then there are locations, the performance of the car in general, etc.

The key idea would be examining which is safer under identical or very similar conditions.

msquared

Re: Tesla crashes and politics
« Reply #2 on: August 16, 2021, 03:55:19 PM »
I think some of the problem is that people think Autopilot will never have an accident.  Ever. It must be perfect at all times in all conditions.  If it is not perfect at all times and in all conditions, it should never be used, because humans are so much better at driving.  (Sarcasm, for those who did not hear it in my typing.)

TheDrake

Re: Tesla crashes and politics
« Reply #3 on: August 16, 2021, 04:57:15 PM »
Let's also not get it twisted. Plowing into an accident scene is the responsibility of the human operator, who is supposed to take over from Autopilot in such a scenario. This is not an autonomous vehicle.

It would be like blaming an airplane autopilot for not maneuvering around a dangerous storm cell.


Fenring

Re: Tesla crashes and politics
« Reply #4 on: August 16, 2021, 06:19:54 PM »
At a certain point, when driving AI exceeds human skill behind the wheel by a fair margin, the regulatory bodies need to do an apples-to-apples comparison, and if they're going to hold AI to a standard then human drivers must be held to the same standard. If the permitted crash rate has to be unbelievably low, then many or even most human drivers shouldn't pass muster either. I am actually fully in favor of eventually having human drivers replaced by AI entirely, with no option to take the wheel. It's like a bus: if the driver steps down, you can't just start driving it yourself, you have to get out. In the interim, when humans are expected to always be on guard to catch the AI if it fails, the margin needs to be maybe better than an average human driver, but asking for perfection reeks of special interests jamming up the system for their personal advantage.

Seriati

Re: Tesla crashes and politics
« Reply #5 on: August 17, 2021, 06:26:20 PM »
The assumptions made are not reasonable, which dramatically alters the results.

Quote
Putting this in context in any year there are typically:

6500 crashes involving EMS/ambulance
3100 crashes involving fire trucks

It's harder to get crash numbers for police vehicles, but they have around 300 police-involved traffic fatalities every year. If we estimate based on the national dataset of all crashes, about 0.5% of crashes are fatal. That means about 60,000 police vehicle crashes every year.

So, we can estimate around 70,000 emergency vehicle crashes per year in the USA. The number of Tesla-involved crashes appears to be 11 over a ~3 year timeframe. So we're comparing against 210000 overall emergency vehicle crashes.

While that sounds like it makes sense, in reality it doesn't.

Tesla is being investigated because its cars are hitting parked emergency vehicles in freeway conditions.  That's not "210000" emergency vehicle crashes; that's a small fraction of vehicle crashes.  There are stats out there on fatalities of emergency responders working on freeways.  That scenario is a disproportionate share of the fatalities from all vehicle-involved accidents related to emergency services, but it's disproportionate specifically because of the higher death rate per accident compared to other types of accidents.

I mean, ambulances are frequently involved in accidents when they cross intersections against the lights; when is a Tesla ever autodriving in that circumstance?  Fire truck accidents with fatalities frequently involve the truck rolling over, not something that is triggered by a Tesla or any other car hitting the truck.  By far the largest portion of all emergency vehicle accidents involve the emergency vehicle in motion (not directly the same thing, but accidents occur during emergency response at over a 60% rate for ambulances and over a 70% rate for fire trucks).  Police cars are frequently involved in accidents during pursuit.  Autopilot is hitting parked emergency vehicles, and that's a very specific subset of all accidents.

Autodrive is primarily used on limited-access freeways (estimates are that over 90% of all miles driven on Autopilot are on such freeways).  The vast majority of emergency vehicle accidents involve moving vehicles, in most cases not on the freeway.  Freeway accidents are less common per mile than other accidents.  Autodrive is not driving Teslas on busy city streets or other areas where emergency vehicles are most frequently in accidents.

It's certainly possible that just recognizing that is enough to break the math.  Teslas could be a disproportionate portion of the accidents involving stopped emergency vehicles on freeways.  I didn't find stats on how common those accidents are, largely because the stats focus on how disproportionately fatal they are.  Emergency responders are at their most vulnerable working on freeway emergencies.  Police, firefighters, medics, and tow truck operators are all struck and killed or severely injured at a greater rate in that type of accident (many accidents outside that context are more dangerous for the other vehicle).

I did not review the underlying report, but this summary is useful: https://www.arnolditkin.com/personal-injury-blog/2018/february/statistics-on-emergency-vehicle-accidents-in-the/  Here's one on fatalities (could not review their underlying report): https://www.respondersafety.com/news/struck-by-incidents/2019-ersi-struck-by-vehicle-fatality-report/

So anytime an analysis starts by over-counting the occurrence rate, including a large number of events that occurred in situations where the test condition does not apply (i.e., a large number of non-freeway accidents where autodrive is not engaged), then assumes an even 50% distribution of Autopilot usage despite that usage being highly correlated with the freeways where the relevant accidents occur, and ignores that in most cases autodrive is monitored by an active driver who may be "saving" autodrive from itself, it makes me question their conclusions.  Are they just not deep thinkers?  Or do they have a purpose in what they present?
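A toy calculation makes the point. The freeway share below is invented purely for illustration, since, as I said, I couldn't find the real figure:

Code:
# Illustrative only: the 2% stopped-on-freeway share is a made-up number
# to show how much the conditioning matters, not a real statistic.
total_ev_crashes = 210_000            # the comment's 3-year figure
freeway_stopped_share = 0.02          # HYPOTHETICAL share that are stopped-on-freeway
relevant_crashes = total_ev_crashes * freeway_stopped_share   # 4,200

tesla_fleet_share = 0.0028            # the comment's own 0.28%
autopilot_on_freeway = 0.9            # HYPOTHETICAL: Autopilot engaged for most Tesla freeway miles

expected = relevant_crashes * tesla_fleet_share * autopilot_on_freeway  # ~10.6
print(11 / expected)                  # ~1.0 -- the "20-30x safer" headline vanishes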

Quote
So, based on those numbers 0.005% of emergency vehicle crashes involved a Tesla on Autopilot.

But what percentage of the relevant crashes involve a Tesla on autopilot?  It could be a significant number.  If you look at the ERSI link, they analyzed the 44 emergency responders killed in roadway accidents in 2019.  Check out this quote from the NY Times:

Quote
The new investigation comes on top of reviews the safety agency is conducting of more than two dozen crashes involving Autopilot. The agency has said eight of those crashes resulted in a total of 10 deaths.
https://www.nytimes.com/2021/08/16/business/tesla-autopilot-nhtsa.html

While that doesn't prove anything, as it may not be comparing apples to apples, it is easy to see why Autopilot crashes on the freeway could be of particular concern if fewer than 50 responders are killed in roadway accidents in an average year.  Even adding 5 deaths a year from a vehicle that the authors claim is 0.28% of all vehicles on the road, where a human driver is still monitoring most of the time, would be evidence of a massive problem.  Given that the Tesla-reported statistics do not include any situation where the human driver intervened and overrode Autopilot to slow down, it is actually possible that the Tesla Autopilot crashes involve an alarming percentage of the Teslas where autodrive was the only driver paying attention.
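To put numbers on that, using only figures already cited in this post (the 5 deaths a year is my hypothetical above, not an observed count):

Code:
# Figures from this post; the 5/year is hypothetical.
responders_killed_per_year = 44      # 2019 ERSI struck-by fatality count
tesla_fleet_share = 0.0028           # the comment's own 0.28%

# Fatalities per year if Teslas struck responders at the fleet-average rate:
expected = responders_killed_per_year * tesla_fleet_share   # ~0.12 per year
print(5 / expected)                  # ~40x the fleet-average rate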

Quote
The US has about 275,000,000 vehicles. So, take the 765k Teslas, a bit more math: Tesla represents 0.28% of vehicles on the road.

If we assume Autopilot is used half the time, we can therefore estimate that a "fleet average" vehicle is ~30x as likely to be in a crash with an emergency vehicle when compared to a Tesla on Autopilot.  Probably need to further reduce that based on a weighted average of Tesla fleet share over the same timeframe, which likely would reduce it to somewhere around ~20x.

And see, building fault upon fault, they get to a conclusion that has little actual value.  Claiming 50% usage when the usage is directly correlated with the type of driving, which is directly correlated with the type of emergency vehicle accident, without any attempt to reconcile those, makes the claims virtually nonsense.

So a Tesla with Autopilot could in fact be "30 times less likely" to be in an accident with an emergency vehicle (which is actually doubtful) and at the same time be 50, 100, 250 times (who knows, really) more likely to be in an accident with a parked emergency vehicle on a freeway.

That difference is why it's being investigated.

You should also take a look at this article, which is pro-Tesla but points out that the safety numbers may not be all they appear:  https://www.forbes.com/sites/bradtempleton/2020/07/28/teslas-arent-safer-on-autopilot-so-researchers-calling-for-driver-monitoring-may-be-right/?sh=5f279471d739  This was my source on the claim that over 90% of the autopilot's miles driven are on freeways.

LetterRip

Re: Tesla crashes and politics
« Reply #6 on: August 17, 2021, 09:39:56 PM »
So a clarification,

Quote
This subset of Tesla Autopilot crashes is important to NHTSA because they all involved cases where first responders were active, the agency said, "including some that crashed directly into the vehicles of first responders."

https://www.caranddriver.com/news/a37320725/nhtsa-investigating-tesla-autopilot-crashes-fatalities/

So it is actually not just cases of crashing into first responder vehicles (I was wondering where the 11 number came from, since any Tesla crash tends to get widely reported), but rather any crash into a first responder vehicle or near a scene where first responders were active.  Now, the number of scenes where first responders are active is far, far greater than the number of times first responder vehicles have been crashed into.


NobleHunter

Re: Tesla crashes and politics
« Reply #7 on: August 18, 2021, 09:05:41 AM »
A possible confound is that components like air bags can be investigated and recalled over small numbers of failures. If it is 11 incidents or whatever, that doesn't seem out of line with the numbers I see cited to justify recalls. So a process is being applied according to an existing standard that might not be well suited to this specific situation.

On the other hand, I doubt Tesla will voluntarily keep pushing autopilot reliability to make it as safe as possible. Safer than a human driver is a relatively low bar and I think only this kind of pressure will make Tesla and other self-driving car makers push for the most reliable car possible, or even practicable.

yossarian22c

Re: Tesla crashes and politics
« Reply #8 on: August 18, 2021, 09:13:04 AM »
I remember a few years ago that Tesla autodrive had an issue with stopped vehicles or other barriers in highway lanes. So the issue here is that on autodrive, when the driver has gone to sleep or is otherwise distracted, it will hit stopped objects at full speed. Not looking it up right now, but I think there was a case in Florida where a man died when his Tesla hit a highway barrier that was part of a lane closure for road construction.

LetterRip

Re: Tesla crashes and politics
« Reply #9 on: August 18, 2021, 10:55:40 AM »
Quote
I remember a few years ago that Tesla autodrive had an issue with stopped vehicles or other barriers in highway lanes. So the issue here is that on autodrive, when the driver has gone to sleep or is otherwise distracted, it will hit stopped objects at full speed. Not looking it up right now, but I think there was a case in Florida where a man died when his Tesla hit a highway barrier that was part of a lane closure for road construction.

This happens if you fall asleep or are distracted using any cruise control.

Seriati

Re: Tesla crashes and politics
« Reply #10 on: August 18, 2021, 12:53:07 PM »
Quote
So a clarification,

Quote
This subset of Tesla Autopilot crashes is important to NHTSA because they all involved cases where first responders were active, the agency said, "including some that crashed directly into the vehicles of first responders."

https://www.caranddriver.com/news/a37320725/nhtsa-investigating-tesla-autopilot-crashes-fatalities/

So it is actually not just cases of crashing into first responder vehicles (I was wondering where the 11 number came from, since any Tesla crash tends to get widely reported), but rather any crash into a first responder vehicle or near a scene where first responders were active.  Now, the number of scenes where first responders are active is far, far greater than the number of times first responder vehicles have been crashed into.

But that actually makes the analysis worse, because that's a small set of crashes, and yet they used the all-inclusive set of total incidents involving an emergency vehicle to get to the 210000 they used on the other side of the ratio from the 11.  That's almost literally manipulating the statistics to make them appear favorable.

And again, the question is whether Teslas are disproportionately represented in the group that actually crashes into those scenes.  When the analysis is manipulated to make that number appear tiny, it makes you wonder about the authors' intent in crafting it.

yossarian22c

Re: Tesla crashes and politics
« Reply #11 on: August 18, 2021, 01:02:59 PM »
Quote
I remember a few years ago that Tesla autodrive had an issue with stopped vehicles or other barriers in highway lanes. So the issue here is that on autodrive, when the driver has gone to sleep or is otherwise distracted, it will hit stopped objects at full speed. Not looking it up right now, but I think there was a case in Florida where a man died when his Tesla hit a highway barrier that was part of a lane closure for road construction.

Quote
This happens if you fall asleep or are distracted using any cruise control.

Yeah, but with regular cruise control you are much more likely to crash or go off the road long before you reach a barrier or emergency vehicle in the road. It is just human nature: the more automated the drive and the less the driver is doing, the more their attention will wander. The question is whether Tesla can create a software patch to teach the cars not to do this without causing some other unintended consequence; I could see a correction causing the car to do weird things when obstructions are near the road. Slamming on the brakes going under a bridge isn't safe either. Maybe they could create a heads-up alarm to alert the driver in situations like this.

At the end of the day it's the human driver's fault, but an autodrive system that is almost perfect is going to have the flaw that the human overseer gets easily distracted.

NobleHunter

Re: Tesla crashes and politics
« Reply #12 on: August 18, 2021, 01:15:55 PM »
Given that Tesla appears to rely on cameras using machine learning for pattern recognition (if I understood the article correctly), it seems likely they'll continue to see crashes as the real world presents events that are inadequately represented in their source data. Apparently flipped or rolled-over cars don't get seen properly.

There's also the question of what I've seen called procedural drift. If you ask operators to do something to mitigate risk and 99% of the time that action is unnecessary, then the likelihood of the operators not doing it goes up. Then your mitigation of "drivers will continue to pay attention while using autopilot" doesn't really reduce the probability of a hazard occurring, since the odds are they won't be paying attention when the car fails to recognize the side of a bus as a solid obstacle.
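A back-of-the-envelope way to see this, with numbers invented purely for illustration: if the driver's attention lapses precisely in the boring situations where it's needed, the mitigation contributes far less than its headline rate suggests:

Code:
# Invented numbers to illustrate procedural drift, not measured rates.
p_car_misses_obstacle = 1e-5          # per encounter, hypothetical
p_attentive_overall = 0.99            # looks great on paper
p_attentive_when_needed = 0.5         # drift: attention lapses exactly when bored

# Naive risk estimate vs. drift-adjusted estimate, per encounter:
naive = p_car_misses_obstacle * (1 - p_attentive_overall)        # 1e-7
adjusted = p_car_misses_obstacle * (1 - p_attentive_when_needed) # 5e-6, 50x worse
print(naive, adjusted)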

LetterRip

Re: Tesla crashes and politics
« Reply #13 on: August 18, 2021, 01:41:39 PM »
Quote
Given that Tesla appears to rely on cameras using machine learning for pattern recognition (if I understood the article correctly)

They do, but importantly only the recent vehicles use that - these crashes are from the entire history of Tesla.  Tesla is eliminating the radar and switching to pure vision.

Quote
it seems likely they'll continue to see crashes as the real world presents events that are inadequately represented in their source data. Apparently flipped or rolled-over cars don't get seen properly.

You don't have to use real-world data; simulated data would be fairly straightforward to generate to ensure that emergency vehicles are highly represented, if that were the issue.  I'm unaware of any of these accidents involving the FSD beta; the accidents I've heard of have been with the first- and second-generation TACC, which was purely radar-based.

Quote
There's also the question of what I've seen called procedural drift. If you ask operators to do something to mitigate risk and 99% of the time that action is unnecessary, then the likelihood of the operators not doing it goes up. Then your mitigation of "drivers will continue to pay attention while using autopilot" doesn't really reduce the probability of a hazard occurring, since the odds are they won't be paying attention when the car fails to recognize the side of a bus as a solid obstacle.

To be clear, as far as I'm aware none of these incidents involved a failure of object recognition by the FSD network.  These have all been radar-based TACC, which is what almost every car with TACC uses, and that algorithm is incapable of dealing with any non-moving object: any stationary object is filtered out by the algorithm.
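Roughly, the reason is that a radar can't distinguish a stopped car from the stationary clutter it sees constantly (overpasses, signs, guardrails), so classic radar TACC simply discards near-stationary returns. A simplified sketch of that filtering logic (illustrative only, not Tesla's or any vendor's actual code):

Code:
def keep_radar_target(closing_speed_mps: float, ego_speed_mps: float,
                      threshold_mps: float = 1.0) -> bool:
    """Illustrative stationary-target filter, not real vendor code.

    Radar measures closing speed; the target's ground speed is roughly
    ego_speed - closing_speed. A parked fire truck and a bridge abutment
    both have ground speed ~0, so classic radar TACC drops them both.
    """
    target_ground_speed = ego_speed_mps - closing_speed_mps
    return abs(target_ground_speed) > threshold_mps  # drop near-stationary returns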

Tesla is eliminating the radar, and current FSD on all new cars uses cameras only, so it will be based on object recognition.