Consumer Reports Says Tesla Shouldn’t Test Beta Software on Public Roads


According to Jake Fisher, senior director of Consumer Reports' Auto Test Center, "consumers are simply paying to be test engineers for developing technology without adequate safety protection." That is a clear reference to Tesla offering FSD as a monthly subscription instead of only charging owners $10,000 for the feature and revoking it if the car is sold. FSD is tied neither to the vehicle nor to the customer, which raises interesting legal questions about the nature of such a purchase.

The consumer protection organization stresses that FSD beta 9 involves more people than just Tesla customers. Autonomous driving technology is normally tested on private tracks or with highly trained test drivers behind the steering wheel. The test cars also carry visible warnings about the tests, allowing people nearby to steer clear if they don't want to take any chances. With FSD beta 9, cyclists, pedestrians, and other drivers would have to avoid every Tesla vehicle around them if they did not want to be part of the beta test.

That's precisely the sort of backlash Tesla can face if the company keeps trying to develop its autonomous driving tech with regular customers. After Tesla vehicles were said to have brake issues in China, some parking lots and shopping malls there banned the company's cars from their premises.

Consumer Reports does not dwell on how damaging that can be for the brand, focusing instead on everyone else involved in these tests. Fisher said that "testing developing self-driving systems without adequate driver support can – and will – end in fatalities."

None of the specialists Consumer Reports talked to defended Tesla's approach. Selika Josiah Talbott, a professor who studies autonomous vehicles at the American University School of Public Affairs in Washington, DC, said cars running FSD behave "almost like a drunk driver."

Missy Cummings, an automation expert and director of the Humans and Autonomy Laboratory at Duke University in Durham, told Consumer Reports that "it's a very Silicon Valley ethos to get your software 80 percent of the way there and then release it and then let your users figure out the problems." For cell phones, that may be acceptable, but not "for a safety-critical system."

Finally, Talbott said that it is "as if the U.S. Department of Transportation has blinders on when it comes to the actions of this particular company." In other words, she and other specialists don't understand why Tesla can test autonomous driving tech on its customers without any obligations on how to do so or liability when things go wrong. Tesla even makes money from that approach: either from the people who agree to pay $10,000 up front – risking losing that amount at any time – or from the recent subscribers.

The entire text is definitely worth a read and, more than that, some reflection. Anyone willing to test FSD beta 9 is not doing so alone on public roads. If Tesla takes no responsibility for that, perhaps these customers should.