Consumer Reports slams Tesla’s Autopilot

On Tuesday, Consumer Reports released a scathing assessment of the latest iteration of Autopilot, Tesla's highly touted driver-assistance system. The nonprofit organization notes that a recent upgrade has made the software "far less competent" than a human driver. Indeed, the report indicates the electric carmaker's latest firmware update is a significant step backward in its journey toward full automotive autonomy.

Risky Maneuvers

Late last year, Tesla released a new feature for its Autopilot program called Navigate, intended to give customers a more seamless driving experience by suggesting lane changes. In recent weeks, however, the company released an update that allows Navigate to execute lane changes on Model 3 sedans without driver confirmation.

Consumer Reports decided to test the functionality of the company’s new autonomous driving feature. The organization’s findings were not encouraging. Testers found Navigate made the driving experience harder by initiating risky and sometimes illegal lane changes.

Navigate repeatedly cut off other cars without leaving enough space to ensure driver safety. Worse, the program frequently changed lanes and braked suddenly in front of fast-moving vehicles. Without operator intervention, Tesla's Navigate-enabled cars might have caused several collisions.

Ars Technica theorized that the application's shortcomings stem from gaps in its rear-facing sensor data. While the Model 3 has front-facing cameras and radar to help it navigate, it lacks rear-facing radar. Consequently, the vehicle may have difficulty judging the speed of cars approaching from behind when executing lane changes.

Admittedly, Tesla has always maintained Autopilot needs to be monitored by a human operator when in use. However, the manufacturer has also marketed its autonomous feature as “Full Self-Driving Capability,” which it plainly is not.

To date, Tesla’s Autopilot program has been involved in three driver fatalities.

Autonomous Traffic Violations

The new feature also attempted road maneuvers that would have earned drivers tickets had they been completed. Consumer Reports performed its tests in Connecticut, where passing on the right on a two-lane highway is prohibited. Navigate, however, initiated the maneuver regardless of local traffic regulations. The organization's testers also noted that the feature did not return the car to the right-hand lane after passing another vehicle.

Navigate’s failure to observe local driving laws suggests an under-discussed obstacle to implementing full self-driving technology. While the United States has national regulations governing the operation of motor vehicles on freeways, individual states, cities, and towns each have their own traffic laws.

Tesla’s autonomous software will need to comply with all those different directives to gain countrywide approval. As there are thousands of different traffic laws in the United States, the company’s cars will need a powerful artificial intelligence solution to make Autopilot work.
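The compliance problem described above can be illustrated with a toy lookup table. This is a minimal sketch, not Tesla's actual approach: the function name, the rule identifier, and every entry except Connecticut's right-passing prohibition (which Consumer Reports cited) are hypothetical.

```python
# Hypothetical per-state table of prohibited maneuvers. Only the CT entry
# reflects a rule mentioned in the article; "NV" below is just an example
# of a state absent from the table.
PROHIBITED_MANEUVERS = {
    "CT": {"pass_on_right_two_lane"},  # per Consumer Reports' test location
}

def maneuver_allowed(state: str, maneuver: str) -> bool:
    """Return False if the maneuver is prohibited in the given state."""
    return maneuver not in PROHIBITED_MANEUVERS.get(state, set())

# A planner would consult such a table before committing to a lane change:
print(maneuver_allowed("CT", "pass_on_right_two_lane"))  # False
print(maneuver_allowed("NV", "pass_on_right_two_lane"))  # True
```

Even this toy version hints at the scale of the problem: with thousands of overlapping state and municipal rules, a real system needs far more than a static table, which is why the article argues a powerful AI solution is required.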

Consumer Reports asked ABI Research automotive analyst Shiv Patel about the overall functionality of Autopilot. Patel told the organization that Tesla currently offers the best-in-class autonomous transportation program. However, the analyst also said the company's current hardware isn't at a level where it can achieve full self-driving.

If Patel's assessment is correct, Tesla will need to improve the quality of its chipsets rapidly. Last month, CEO Elon Musk announced his company would roll out one million autonomous taxis in 2020.