1. High-severity accidents might drop, but the industry bleeds money on high-frequency, low-speed incidents (parking lots, neighborhood scrapes). Autonomy has diminishing returns here; it doesn't magically prevent the chaos of mixed-use environments.
2. Insurance is a capital management game. We’ll likely see a tech company try this, fail to cover a catastrophic liability due to lack of reserves, and trigger a massive backlash.
It reminds me of early internet optimism: we thought connectivity would make truth impossible to hide. Instead, we got the opposite. Tech rarely solves complex markets linearly.
> Insurance is a capital management game. We’ll likely see a tech company try this, fail to cover a catastrophic liability due to lack of reserves, and trigger a massive backlash.
Google, AFAIK the only company with cars that are actually autonomous, has US$98 billion in cash.
It'd have to be a hell of an accident to put a dent in that.
All unlimited-liability insurance companies (e.g. motor insurers in the UK) buy reinsurance to take the hit on claims over a certain level (e.g. 100k, 1m).
For extreme black swan risks, this is how you prevent the insurance company just going bankrupt.
Reinsurers themselves then also have their own reinsurance, and so on. The interesting thing is that you then have to keep track of the chain of reinsurers to make sure they don't turn out to be insuring themselves in a big loop. A "retrocession spiral" could take out many of the companies involved at the same time, e.g. the LMX spiral.
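To make the layering concrete, here's a toy sketch of how an excess-of-loss programme splits a single claim across the primary insurer, a reinsurer, and a retrocessionaire. The attachment points and limits are illustrative numbers, not anyone's actual treaty terms:

```python
def split_claim(claim, layers):
    """Apportion a claim across (attachment, limit) layers.

    `layers` is a list of (attachment_point, layer_limit) tuples,
    ordered from the primary insurer upward.
    """
    shares = []
    for attachment, limit in layers:
        # Amount of the claim falling inside this layer's band.
        share = max(0, min(claim - attachment, limit))
        shares.append(share)
    return shares

# Primary insurer retains the first 100k, the reinsurer covers the next
# 900k, a retrocessionaire covers the next 9m (made-up figures).
layers = [(0, 100_000), (100_000, 900_000), (1_000_000, 9_000_000)]

print(split_claim(50_000, layers))     # small claim: primary pays it all
print(split_claim(2_500_000, layers))  # large claim: spread across layers
```

The spiral risk is exactly that the last tuple in that list might, several hops later, point back at one of the earlier companies.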
If it's cheaper for them to pay lawyers a few tens or hundreds of millions to bury any such case in court, settle it quietly, or put the agitator through any of the myriad forms of living hell they can legally get away with, then they'll go that route.
You'd need an immensely rich or influential opponent to decide they wanted to march through hell in order to hold Google's feet to the fire. It'd have to be something deeply personal and they probably have things structured to limit any potential liability to a couple hundred million. They'll never be held to account for anything that goes seriously wrong.
> Auto insurers don't face a "catastrophic liability" bankrupting scenario like home insurers might in the case of a natural disaster or fire.
This changes with self-driving. Push a buggy update and potentially every car of the same model could crash on the same day.
This is not a threat model regular car insurers need to deal with, since their customers will never all decide to drive drunk on the same day; but that's effectively what a buggy software update would be like.
Far be it from me to tell automakers how to roll out software but I would expect them to have relatively slow and gradual rollouts, segmented by region and environment (e.g., Phoenix might be first while downtown London might be last).
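As a sketch of what that gating might look like: only promote the update to the next region once the previous wave has accumulated enough miles at an acceptable incident rate. The wave names, thresholds, and stats format are all hypothetical, not any automaker's actual process:

```python
# Hypothetical staged-rollout gate for an over-the-air update.
WAVES = ["phoenix_suburbs", "sunbelt_highways", "us_cities", "downtown_london"]

def next_wave_allowed(completed_wave_stats,
                      max_incidents_per_million_miles=0.5,
                      min_miles=1_000_000):
    """Advance only after enough exposure at an acceptable incident rate."""
    miles = completed_wave_stats["miles"]
    incidents = completed_wave_stats["incidents"]
    if miles < min_miles:
        return False  # not enough data yet to judge the update
    rate = incidents / (miles / 1_000_000)
    return rate <= max_incidents_per_million_miles

print(next_wave_allowed({"miles": 2_000_000, "incidents": 1}))  # within threshold
print(next_wave_allowed({"miles": 500_000, "incidents": 0}))    # too little data
```

The point is that the blast radius of a bad update is bounded by the size of the current wave, which is what makes the "every customer drunk on the same day" scenario insurable at all.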
I doubt autonomous car makers will offer this themselves. They'll either partner with existing insurers or try to build a separate insurance provider of their own which does this.
My guess, if this actually plays out, is that existing insurers will create a special autonomy product that will modify rates to reflect differences in risk from standard driving, and autonomy subscriptions will offer those in a bundle.
Bundling a real product with a financial institution is a time tested strategy.
Airlines with their credit cards are basically banks that happen to fly planes. Starbucks' mobile app is a bank that happens to sell coffee. Auto companies have long had financing arms; if anything, providing insurance on top of a lease is the natural extension of that.
> High-severity accidents might drop, but the industry bleeds money on high-frequency, low-speed incidents (parking lots, neighborhood scrapes). Autonomy has diminishing returns here; it doesn't magically prevent the chaos of mixed-use environments.
This seems like it can be solved with a deductible.
I think the parent might be implying that a 10 mph collision can total a car just as effectively as a 100 mph collision. There might be more left of the occupants, but the car itself might still be a total loss from a cost-to-repair perspective.
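For what it's worth, the deductible argument is easy to see with toy numbers: a deductible wipes the high-frequency, low-severity claims off the insurer's book entirely, while barely touching the big ones. Claim amounts here are made up:

```python
# Mostly parking-lot scrapes, plus one total loss (illustrative figures).
claims = [300, 450, 800, 1_200, 25_000]

def insurer_payout(claim, deductible):
    # Insurer pays only the portion of the claim above the deductible.
    return max(0, claim - deductible)

no_deductible = sum(insurer_payout(c, 0) for c in claims)
with_deductible = sum(insurer_payout(c, 1_000) for c in claims)
print(no_deductible, with_deductible)
```

With a 1,000 deductible, four of the five claims vanish from the insurer's perspective, which is exactly the "chaos of mixed-use environments" cost being pushed back onto the owner.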
> Autonomy has diminishing returns here; it doesn't magically prevent the chaos of mixed-use environments.
It doesn't prevent chaos, but it does provide ubiquitous cameras. That will be used against people.
I'm ambivalent about that, leaning mostly negative. On the one hand, I'd very much love to see people who cause accidents have their insurance go through the roof.
On the other hand, the insurance companies will force self-driving on everybody through massive insurance rate increases for manual driving. Given that we do not have protections against companies that can make you a Digital Non-Person with a click of a mouse, I have significant problems with that.
> I'd very much love to see people who cause accidents have their insurance go through the roof.
Life is hard and people make mistakes. Let the actuaries do their job, but causing an accident is not a moral failure, except in cases like drunk driving, where we have actual criminal liability already.
> the insurance companies will force self-driving on everybody through massive insurance rate increases for manual driving.
Why would manual driving be more expensive to insure in the future? The same risks exist today, at today's rates, but with the benefit that over time the other cars will get harder to hit, reducing the rate of accidents even for humans (kinda like herd immunity).
> Given that we do not have protections against companies that can make you a Digital Non-Person with a click of a mouse, I have significant problems with that.
I absolutely think this is going to be one of the greater social issues of the next generation.
>Why would manual driving be more expensive to insure in the future? The same risks exist today, at today's rates, but with the benefit that over time the other cars will get harder to hit, reducing the rate of accidents even for humans (kinda like herd immunity).
I think it will get cheaper, because people who want to do risky things that distract from driving will self-select into autonomous vehicles.
Interesting theory, I would have assumed the exact opposite. People who want to drive fast and take risks will select manual driving because they'll find the autonomous cars too boring.
It's a numbers game. Those people basically don't exist compared to cheapskates who want to drive old cars and people who crash while driving distracted. It's gonna come down to how many people who want to text and drive, or do other sketchy stuff, make the jump to autonomous cars. Classic car insurance is already stupid cheap just because it implicitly excludes a bunch of risky demographics.
Yes, imagine you bought a Google self-driving car for $70,000, and one day their algorithm gets mad at you due to a glitch, and your Google account is locked, your car can no longer be unlocked, can't be sold, and your appeals are instantly rejected and you have no recourse. Just a typical day in Google's world.