
Here's why I still don't trust Tesla Autopilot

Tesla's leap in AI has left competitors like Alphabet's Waymo, Uber, BMW and GM scrambling in vain to keep pace, and there is a long lineup of companies wanting to partner, including Toyota and Mercedes-Benz.
Mack Hogan | CNBC
Key Points
  • Autopilot is easily the most competent and polished driver assistance system on the market.
  • Unfortunately, this competence means the car is often overconfident and keeps control even when it doesn't know what to do.
  • In order for users to trust it, Tesla needs to make the system less reluctant to ask for help.

Tesla seems a bit nervous about its Autopilot autonomous driving technology, and now I know why.

When I first engaged Autosteer in the settings menu, I was greeted by a wall of legalese explaining the responsibilities I have as a driver while operating the system. In case that wasn't enough, Tesla also insisted that I first try Autopilot with a communications executive in my passenger seat ensuring I was ready to operate it safely and correctly ahead of my Model S P100D review.

You can't really blame Tesla.

The company has encountered serious backlash over the system. Some have blamed it for crashes; others have criticized the cavalier marketing strategy and allegedly misleading name. Elon Musk has also declared his intention to one day morph Autopilot into a fully autonomous system, and he's already selling cars marketed as having the hardware necessary for full self-driving capabilities.

Typically, when I review cars with semi-autonomous capability, I add a few sentences under the "driving" section to critique the systems. But there's far too much to unpack with Autopilot, so I decided to write a separate article focusing on what it does and what it fails to do.

Capabilities: It's all in the name

A Tesla Autopilot sensor
Mack Hogan | CNBC

First off, let's nail down the raw capabilities of Autopilot. The system groups traffic-aware cruise control — or adaptive cruise control, radar cruise control, whatever you may call it — with a technology still in beta, called Autosteer. That means the entirety of Autopilot is technically a beta product, which is an important if often-overlooked disclaimer.

Traffic-aware cruise employs radar and ultrasonic sensors to detect other motorists. Autosteer, meanwhile, uses stereoscopic cameras to read lane markings. All of this is stitched together by the car's computers to map out what's going on and where the car should go. Throttle, brakes and steering are applied automatically to ensure that the car stays safely centered in its lane and maintains a reasonable following distance from the vehicle in front.
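Tesla doesn't publish how those pieces fit together, but the loop that paragraph describes is a classic one: measure the gap to the car ahead, measure your offset from lane center, and nudge throttle, brake and steering accordingly. Here's a deliberately simplified sketch in Python; every name, gain and threshold is my own invention, not Tesla's code.

    # Illustrative only: a toy version of the control loop described above.
    # No names, gains or thresholds here come from Tesla; all are invented.

    def autopilot_step(own_speed_mps: float, lead_gap_m: float,
                       lateral_offset_m: float, set_speed_mps: float):
        """One tick of lane-centering, traffic-aware cruise control."""
        # Traffic-aware cruise: radar supplies the gap to the lead vehicle.
        # Brake if the time gap falls under ~2 seconds, else chase set speed.
        time_gap_s = lead_gap_m / max(own_speed_mps, 0.1)
        if time_gap_s < 2.0:
            throttle, brake = 0.0, min(1.0, (2.0 - time_gap_s) * 0.5)
        else:
            throttle = max(0.0, (set_speed_mps - own_speed_mps) * 0.1)
            brake = 0.0

        # Autosteer: the camera reports the car's offset from lane center;
        # a proportional controller steers back toward the middle.
        steering = -0.05 * lateral_offset_m
        return throttle, brake, steering

    # Example: ~70 mph, 50 m behind the lead car, drifting 0.3 m right.
    print(autopilot_step(31.0, 50.0, 0.3, 31.3))  # brakes gently, steers left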

On multi-lane highways, the system can also execute lane changes, without the driver having to turn the wheel, provided the adjacent lane is clear and the driver activates the turn signal.
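The gate on that maneuver is, at least conceptually, a simple conjunction. A hypothetical check, again mine rather than Tesla's:

    # Hypothetical gating logic for an automated lane change: proceed only
    # if the driver asked for it and the sensors report a clear gap.
    def lane_change_allowed(on_multilane_highway: bool, turn_signal_on: bool,
                            adjacent_lane_clear: bool) -> bool:
        return on_multilane_highway and turn_signal_on and adjacent_lane_clear

As the second half of this article shows, the hard part is that last input, which is only as good as the sensors feeding it.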

Altogether, Tesla's Autopilot functions more or less like a traditional airliner autopilot. The car won't change lanes on its own initiative or swerve to avoid obstacles; it will simply maintain course.

Tesla's system is at no time responsible for the vehicle. The person behind the wheel still has full legal responsibility to closely monitor the situation and take control should anything unforeseen occur. Don't expect to be taking any naps.

Limitations: Use with caution

An Autopilot sensor on the Model S P100D
Mack Hogan | CNBC

Autopilot should be used only on divided highways, as it isn't yet capable of handling perpendicular traffic. Responding to cross traffic requires far more decision-making than Tesla may want to take responsibility for.

It's also designed only for use in areas where lanes are clearly marked. The company warns against construction zones, especially after a viral video showed a Model S on Autopilot slamming into a wall due to unclear lane markings in a work area.

And because it's worth repeating: Autopilot is driver assistance technology, not driverless technology. Vigilant and constant supervision is required.

Performance: Or, how that plays out in the real world

Using Autopilot during a drive to Detroit
Mack Hogan | CNBC

First, an objective measure. On a ride from Columbus to Detroit, I found 35 miles of construction-free interstate — a damned near impossible feat out here in the Midwest — to try out the system. I recorded how often the vehicle told me to put my hands back on the wheel and how many mistakes it made.

At a speed of 70 miles per hour, the test took 30 minutes to complete. During that time, the vehicle asked me to put my hands on the steering wheel 19 times, or about once every minute and a half. In those 30 minutes, the vehicle made a grand total of zero mistakes.
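Those figures check out against each other. A quick back-of-the-envelope verification:

    # Sanity-checking the test math from the drive above.
    distance_mi, speed_mph, prompts = 35, 70, 19
    duration_min = distance_mi / speed_mph * 60   # 30.0 minutes
    interval_s = duration_min * 60 / prompts      # ~94.7 s, about 1.5 min
    print(duration_min, round(interval_s, 1))     # 30.0 94.7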

At one point, I was cut off by a Highlander and the vehicle quickly and smoothly responded. Many adaptive cruise control systems panic and brake far too aggressively, potentially causing a rear-end collision; not the Model S, though.

At one point, the car did lose sight of the left lane marking. Instead of disengaging, the car simply clung to the right lane marking like a barnacle to a cargo ship. Losing one marking and clinging to the other sounds like a good idea, but it's emblematic of a larger issue I had with Autopilot over hundreds of miles with the car.

The Issue: Autopilot is overconfident

Sitting in the front seat of the Model S P100D
Mack Hogan | CNBC

Autopilot is overconfident. Had this been a fully autonomous car, a best-guess approximation of the lane based on one marking would be the right decision. But it's not fully autonomous, and it needs to stop pretending it is.

See, a semi-autonomous system needs to be quicker to call the driver back into direct control. Waiting until both lane markings disappear is probably too late, so when you lose a marking, the car needs to tell the driver that Autopilot is out of its depth. That's what I've experienced in Volvos, BMWs and Lexuses. But time and time again, the Tesla seemed more concerned with looking like it knew what it was doing than keeping me safe.
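In code terms, the difference between those two philosophies is just where the handoff threshold sits. A toy comparison, assuming (my assumption, not anything Tesla has published) that the car knows which lane markings it can currently see:

    # Illustrative handoff policies; both branches are invented for contrast.
    def should_alert_driver(left_visible: bool, right_visible: bool,
                            conservative: bool) -> bool:
        """Decide when to demand that the driver take over."""
        if conservative:
            # What this article argues for: losing either marking is
            # enough to hand control back.
            return not (left_visible and right_visible)
        # What I observed: keep driving on a best guess until both
        # markings are gone.
        return not (left_visible or right_visible)

    print(should_alert_driver(False, True, conservative=True))   # True
    print(should_alert_driver(False, True, conservative=False))  # False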

I continued to test the system on my ride back from Detroit. I went to test the auto-lane-change feature, and as the car moved into the lane I noticed a black Tahoe coming up fast. I jerked the car back into the original lane of travel and avoided an incident, but the car never seemed to register that it was moving itself into the path of a speeding, monstrous SUV.

Obviously, the car wasn't getting a lot of good information from its rear-facing sensors. And it's not hard to see why: Tesla uses ultrasonic sensors to detect vehicles in its blind spot, rather than the more ubiquitous radar. That means limited range and limited visibility.
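To put rough numbers on that (these are typical figures for the sensor types, not Tesla specifications): automotive ultrasonic sensors reach out to single-digit meters, while long-range radar sees well past 100 meters. At highway closing speeds, that gap is the difference between seconds of warning and almost none:

    # Rough warning-time estimate for a fast-approaching vehicle.
    # Ranges and speeds are generic assumptions, not Tesla specs.
    ultrasonic_range_m = 8.0         # typical automotive ultrasonic reach
    radar_range_m = 150.0            # typical long-range automotive radar
    closing_speed_mps = 20 * 0.447   # 20 mph speed differential, ~8.9 m/s

    print(ultrasonic_range_m / closing_speed_mps)  # ~0.9 s of warning
    print(radar_range_m / closing_speed_mps)       # ~16.8 s of warning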

If the car can't see a Tahoe careening down on its keester, is it really fair to say it has all the hardware necessary for self-driving? Maybe once Tesla turns on more of the auxiliary cameras the car will get a better view of what's going on, but for now I'm skeptical.

It's worth noting here that Tesla officially calls the lane-change feature an advanced driver assistance system, and the responsibility to check for approaching vehicles rests with the driver. Moreover, Tesla says the vehicle's constant reminders to place your hands on the wheel emphasize that the system is meant to add to the safety of your driving, never to fully control the vehicle.

Conclusion: Trust isn't negotiable

Pulled over for a shot of the Tesla Model S P100D
Mack Hogan | CNBC

That, though, is the crux of it all. Skepticism, doubt, fear: these are the things that can kill any attempt at winning over the public and getting them into autonomous cars. Anyone who's spent 10 minutes on research can tell you autonomous cars will eventually be better drivers than humans are.

But the truth is that right now, the commercially available systems still have a lot to learn. Tesla's car is flummoxed by bumps, sometimes can't see lane markings and truly cannot be safely used outside of major highways. That's not a bad thing; it's technology in its infancy.

The bad part is that the car refuses to acknowledge its shortcomings. Even when it clearly should, it seems reluctant to tell the driver, "I can't handle this! Snap out of it and start driving!" Because of this, I never trusted Tesla Autopilot. And I don't think you should, either.