AI/Machine Learning, Consumer Expectations, and Liability
As a Tesla owner, I purchased a Model S two years ago, drawn by its promise of cutting-edge self-driving capability and the prospect of never having to buy gasoline again.
It’s been two years, and I’m still waiting to reap the benefits.
My car and I have been through a couple of accidents—the last one while cruising north on I-55 one night between Jackson, MS, and Memphis, TN, with my daughter in late July. The car was on autopilot, and we were getting low on charge. I was glancing between the road and the center screen for details (time/distance) to our next charge stop in Grenada, MS.
The highway is two lanes in each direction, and it was completely free of other traffic. There was a short bridge coming up, so I expected a little bump as we crossed the seam. The bump we got was SO much more than expected; whatever we encountered on the highway actually caused a bit of lift on the car. Not exactly "Dukes of Hazzard," but alarming nonetheless. The best we've been able to guess is that we hit some sort of animal that was low to the road . . . a roaming raccoon . . . or an unfortunate armadillo, maybe.
Ultimately, this little varmint damaged the battery intercooler and resulted in thousands of dollars in repairs around the impact zone at the front center of the car.
Tesla Roadside Assistance was of little help that night, which is why I’m a fan of AAA.
Long story short—6 weeks and $6K later—I was able to fly back to Memphis and pick up my car. Oh joy.
So what about this autopilot stuff? This is why people buy these cars, right? We’re expecting some “Wow!” from the software, sensors, and associated hardware.
As a consumer, I feel that Tesla doesn’t stand behind their product very well. First of all, for the price of a Tesla, Roadside Assistance should be a stellar experience. “Thank you for being a customer. Are you okay? Are you in a safe place? How may we help?” Customers using their technology want to know Tesla has some skin in the game if there’s an accident.
People driving Teslas aren't necessarily misusing the technology; we're simply expecting the technology to perform as advertised. In the time I've owned one, I've noticed it doesn't always see soft targets. Damn frightening. A human leveraging Autopilot should get an augmented driving experience: reasonable safety from either one alone, and even better safety from the two together.
The gaps I see in the Tesla software are disturbing though.
Now, with version 10.0, Tesla has released Smart Summon:
With Smart Summon, customers who have purchased Full Self-Driving Capability or Enhanced Autopilot can enable their cars to navigate a parking lot and come to them or their destination of choice, as long as the car is within their line of sight. It’s the perfect feature to use if you have an overflowing shopping cart, are dealing with a fussy child, or simply don’t want to walk to your car through the rain. Customers who have had early access to Smart Summon have told us that it adds convenience to their trips and provides them with a unique moment of delight when their car picks them up to begin their journey. Those using Smart Summon must remain responsible for the car and monitor it and its surroundings at all times.
The Tesla hype
Smart Summon experiences as reported by Today on or around October 2nd
One YouTube reviewer's experiences based on experimentation
Last year's self-driving Uber accident in Tempe, AZ
In addition, in my experience so far, Autopilot does not pay attention to stoplights. If you're the lead car approaching a light, do not expect the car to brake for you. Scary.
. . . another fact, more annoying than scary—but still, it speaks to a lack of attention to detail and a lack of customer care. Tesla sent me an email about purchasing the self-drive option a few months ago. Funny, I purchased that option with the vehicle—it’s on my invoice! For a software company, wouldn’t you think this vehicle information would be in a customer contact database somewhere? “Send solicitations” vs. “Do not send solicitations” . . . Meh, just a thought.
Autopilot features, especially the shortcomings, aren’t well documented. Perhaps there should be a special endorsement required and maintained for licensed drivers to be able to operate a self-driving vehicle? Again, just a thought. This is an emerging landscape.
We're still years off, at least a decade in my opinion, from this technology becoming practical and trustworthy. I wonder whether my Tesla Model S and its supporting hardware and sensors will run that software when it's finally released?!