
Another Tesla Crash, What It Teaches Us

Earlier this week, I came across a report of a Tesla AutoPilot crash. It appeared on the Tesla Motors Club site, posted by a Tesla fan who was planning to purchase a car.

The user’s post on the site’s forum read:

I was on the last day of my 7-day deposit period. I was really excited about the car. So I took my friend to a local Tesla store and we went for a drive. AP [AutoPilot] was engaged. As we went up a hill, the car was NOT slowing down approaching a red light at 50 mph. The salesperson suggested that my friend not brake, letting the system do the work. It didn't. The car in front of us had come to a complete stop. The salesperson then said, “brake!” Full braking didn't stop the car in time and we rear-ended the car in front of us HARD. All airbags deployed. The car was totaled. I have heard from a number of AP owners that there are limitations to the system (of course) but, wow! The purpose of this post isn't to assign blame, but I mention this for the obvious reason that AP isn't autonomous and it makes sense to have new drivers use this system in very restricted circumstances before activating it in a busy urban area.

Thankfully, nobody got hurt. The post got no traction in the media; no reporter appears to have followed up on it (except for this publication). It could easily have been filed under the rubric “minor accidents,” the sort of news we all ignore.

(Source: Tesla Motors Club)
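For a sense of why the salesperson’s last-second “brake!” call had no chance, consider a rough back-of-the-envelope stopping-distance estimate. This is a minimal sketch, not data from the incident: only the 50-mph approach speed comes from the post, while the deceleration and reaction-time figures are generic assumptions.

    # Rough stopping-distance estimate for the scenario in the forum post.
    # Only the 50 mph approach speed comes from the post; the deceleration
    # and reaction-time figures below are generic assumptions.
    MPH_TO_MS = 0.44704                    # miles per hour -> meters per second

    speed = 50 * MPH_TO_MS                 # ~22.4 m/s approach speed (from the post)
    decel = 8.0                            # m/s^2, assumed full braking on dry asphalt
    reaction_time = 1.0                    # s, assumed delay before the brakes bite

    reaction_dist = speed * reaction_time  # ground covered before braking begins
    braking_dist = speed**2 / (2 * decel)  # v^2 / (2a) under constant deceleration
    print(f"{reaction_dist:.0f} m reacting + {braking_dist:.0f} m braking "
          f"= {reaction_dist + braking_dist:.0f} m to stop")
    # -> 22 m reacting + 31 m braking = 54 m to stop

Under these assumptions, the car needs on the order of 50 meters of clear road to stop from 50 mph, which squares with the poster’s account of hard braking that simply came too late.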

However, this accident, and even more so the subsequent discussion in the Tesla Motors Club forum, intrigued me for two reasons.

First, it’s a reminder that it ain’t easy to curb drivers’ appetite to “test the limits” of so-called AutoPilot, despite the carmaker’s stern warnings.

The key case in point is Tesla’s first fatal accident, which took place in Florida last May. After splurging on such a well-regarded, expensive automobile, who wouldn’t want to test its limits in driving and brag about it? The inevitable result is a clash between Tesla’s prudence and human nature.

Second, AutoPilot is still a work in progress. New technologies keep emerging, allowing the automaker to keep improving it via software updates.

I was amazed to see so many posts by other Tesla users, all apparently very knowledgeable. They discussed the limitations of radar, the problems AutoPilot has handling hills, and the differences between software versions 7.1 and 8.0.

If this isn’t “inside baseball,” what is? I’d hate to think that an average driver needs to do this much homework to really understand why AutoPilot just doesn’t work in certain situations and why it isn’t autonomous.

But first things first: I had to find out whether the accident described on the Tesla Motors Club forum actually happened. I had no corroboration, and I’m still too fussy to believe every blog post I see.

It took Tesla a few days, but the company finally got back to me Thursday with the following statement:

“After speaking with our customer and the Tesla employee, we determined that this accident was the result of a miscommunication between the individuals inside the car.”

The accident happened in the Los Angeles area. The vehicle that crashed during the test drive was running software version 8.0.

As the statement above makes clear, Tesla stressed that the accident was not the result of the driver intentionally testing the limits of AutoPilot; it happened because of a miscommunication between the individuals inside the car.

Version 8 of the software, according to Tesla, offers “the most significant upgrade to Autopilot,” among other things. It uses “more advanced signal processing to create a picture of the world using the onboard radar.”

To be clear, radar had already been added to all Tesla vehicles in October 2014 as part of the Autopilot hardware suite, but it was only meant to be a supplementary sensor to the primary camera and image-processing system, the company explained.

At the time of the version 8.0 release, Tesla made it clear that it had changed its mind: “We believe radar can be used as a primary control sensor without requiring the camera to confirm visual image recognition.”
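Tesla hasn’t published its decision logic, but the shift it describes can be sketched conceptually. What follows is a hypothetical illustration, assuming simple boolean detections from each sensor; the names and logic are mine, not Tesla’s, and the real system is vastly more complex.

    from dataclasses import dataclass

    @dataclass
    class Detection:
        camera_sees_obstacle: bool   # output of the image-processing system
        radar_sees_obstacle: bool    # output of the onboard radar

    def should_brake_pre_v8(d: Detection) -> bool:
        # Pre-8.0, as described: the camera is primary and radar merely
        # supplements it, so a radar-only return does not trigger braking.
        return d.camera_sees_obstacle

    def should_brake_v8(d: Detection) -> bool:
        # 8.0, as described: radar can act as a primary control sensor,
        # "without requiring the camera to confirm visual image recognition."
        return d.radar_sees_obstacle or d.camera_sees_obstacle

    # A stopped car the camera misses (say, cresting a hill into glare)
    # might show up as a radar-only detection:
    stopped_car = Detection(camera_sees_obstacle=False, radar_sees_obstacle=True)
    print(should_brake_pre_v8(stopped_car))   # False -- no braking before 8.0
    print(should_brake_v8(stopped_car))       # True  -- radar alone now suffices

The change is visible in the last two lines: under the old policy, a radar return without camera confirmation is ignored, while under the new one it can initiate braking on its own.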

To Tesla’s credit, the company isn’t dismissing the incident out of hand. It has been painfully aware, since the Florida crash, of the intense scrutiny now focused on AutoPilot compared with other carmakers’ similar Level 2 systems.

To read the rest of this article, visit EBN sister site EE Times.

2 comments on “Another Tesla Crash, What It Teaches Us”

  1. mobbydi
    November 16, 2016

    There will always be accidents, and the more electric cars there are on the road, the more accidents there will be.

    Tesla makes the best cars.

  2. TylerD
    November 22, 2016

    Was this Model S configured with the updated hardware? 
