One of the questions that has always interested me regarding self-driving vehicles is how the insurance bit would play out. Apparently, in the trial of Uber’s self-driving cars in Pittsburgh, passengers are, likely unknowingly, waiving any rights to compensation. So the answer to my question is: you are on your own. And I personally think that this is a gap insurance companies should fill.
Insurance for cars today is simple, simple enough that you
can get a quote online based on a few questions. These questions are based
around the factors that are thought to be important in determining your risk of
getting into an accident, and the premium you pay depends on that risk.
Makes sense, right? People with higher risk pay more.
The science behind the premium you pay for your insurance is well established. A quick look at the questions that help gauge your premium shows the main drivers of risk: demographics, past driving history, experience and track record, and vehicle characteristics.
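Purely as an illustration of how such factor-based pricing works, here is a small sketch in Python. The base rate, factor names and multipliers are invented for the example and are not taken from any actual insurer’s model.

# Toy illustration only: base rate, factor names and multipliers are
# made up; real actuarial pricing models are far more sophisticated.
BASE_ANNUAL_PREMIUM = 600.0  # hypothetical base rate

RISK_MULTIPLIERS = {
    "driver_under_25": 1.6,        # demographics
    "accident_last_3_years": 1.4,  # past driving history (per accident)
    "licensed_under_2_years": 1.3, # experience and track record
    "high_performance_car": 1.25,  # vehicle characteristics
}

def estimate_premium(under_25, accidents, newly_licensed, high_performance):
    """Scale the base rate by a multiplier for each risk factor present."""
    premium = BASE_ANNUAL_PREMIUM
    if under_25:
        premium *= RISK_MULTIPLIERS["driver_under_25"]
    premium *= RISK_MULTIPLIERS["accident_last_3_years"] ** accidents
    if newly_licensed:
        premium *= RISK_MULTIPLIERS["licensed_under_2_years"]
    if high_performance:
        premium *= RISK_MULTIPLIERS["high_performance_car"]
    return round(premium, 2)

# e.g. a young, newly licensed driver with one recent accident
print(estimate_premium(True, 1, True, False))  # -> 1747.2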
The question is: how do you estimate the risk of a self-driving car? Who is responsible in case of an accident: the vehicle manufacturer, the sensor manufacturer, or the company that provides the algorithms that translate sensor data into how the car behaves? Having a self-driving car from a ride-sharing company adds an extra layer of complexity.
Tesla has taken a different route, insisting that, even though you might want to watch Harry Potter in your Model S, you shouldn’t (https://www.theguardian.com/technology/2016/jul/01/tesla-driver-killed-autopilot-self-driving-car-harry-potter). The carmaker did say that the Autopilot that was likely engaged at the time of the crash is not a substitute for drivers paying attention, but more of a driving aid.
In fact, Tesla has since ordered updates to the autopilot (https://www.theguardian.com/technology/2016/sep/11/tesla-self-driving-cars-autopilot-update) because it was felt that drivers were becoming too confident and not paying enough attention, despite the crash in May (https://www.theguardian.com/technology/2016/jun/30/tesla-autopilot-death-self-driving-car-elon-musk).
Other companies like Uber, which do not produce the cars themselves, seem to take a different approach.
Uber’s cars in Pittsburgh, just like nuTonomy’s in Singapore (http://www.theonlinecitizen.com/2016/08/25/singapore-to-have-the-worlds-first-self-driving-taxi-service-starting-from-one-north/), have a driver who can take back control if need be (http://www.bloomberg.com/news/features/2016-08-18/uber-s-first-self-driving-fleet-arrives-in-pittsburgh-this-month-is06r7on). Does that make the driver responsible?
In fact, Grab has just signed a collaboration with nuTonomy in Singapore (http://news.asiaone.com/news/singapore/uber-rival-grab-partners-driverless-car-firm-nutonomy-singapore#cxrecs_s). Apparently, outside the limits of One North, the driver will be responsible.
I am not sure about the case in Singapore, but according to the Guardian, users of the Uber self-driving service in Pittsburgh have waived their rights to compensation. The driver is not responsible, and hence you cannot be covered under commercial usage terms. This is something potential clients of the Grab service in Singapore might want to enquire about.
In any case, theoretically, how could the premium be
calculated?
Machines are trained on data. Data is obtained from recording real-life experiences – let’s ignore simulations for the moment, since simulations are just another layer of the same thing. For example, I am quite sure nuTonomy will be capturing data from the Grab vehicles that will be plying the Singapore roads outside One North, hence under the driver’s control.
It might be a good idea to make all the training data available to insurers so they can understand whose behaviour the machine learnt from. Any oversampling should also be made clear. Then the insurers might have an idea of which type of driver profile the machine would be equivalent to.
Transparency is key.
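To make the idea concrete, here is a rough sketch of what an insurer might do with such a disclosure: given the driver profiles behind the training records and any oversampling weights, work out the effective mix of driving behaviour the machine learnt from. The record fields, profiles and weights are all hypothetical.

from collections import defaultdict

# Hypothetical per-trip training records disclosed to an insurer:
# (driver_profile, hours_of_driving, oversampling_weight).
training_log = [
    ("male_40s_experienced",   120.0, 1.0),
    ("female_30s_experienced",  80.0, 1.0),
    ("male_20s_novice",         10.0, 5.0),  # rare-event data oversampled 5x
]

def effective_profile_mix(log):
    """Share of effective training exposure per driver profile,
    after accounting for oversampling weights."""
    exposure = defaultdict(float)
    for profile, hours, weight in log:
        exposure[profile] += hours * weight
    total = sum(exposure.values())
    return {profile: round(hrs / total, 3) for profile, hrs in exposure.items()}

print(effective_profile_mix(training_log))
# -> {'male_40s_experienced': 0.48, 'female_30s_experienced': 0.32,
#     'male_20s_novice': 0.2}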
In a few years, you might be cursing at the self-driving vehicle you had a close shave with: “You drive like a 55-year-old ah pek!” in Singapore, or “You drive like a college-educated, 40-year-old married white man!” in the USA.
Post Script:
I know the availability of training data is a very basic first step; there are more complications, since sensors might fail or be pushed too far:
or the machine might still have more to learn:
and be careful, humans learn too.