Consumer Reports said on Tuesday that Tesla Inc’s “Full Self-Driving” software lacks safeguards, and raised concerns that the system’s use on public roads puts the public at risk, citing reports from drivers.
The influential consumer publication cited videos posted on social media of drivers using the software and raised concerns about issues including “vehicles missing turns, scraping against bushes, and heading toward parked cars.”
Consumer Reports said it plans to independently test the software update, known as FSD Beta 9, as soon as its Model Y receives it.
Tesla and the National Highway Traffic Safety Administration (NHTSA) did not immediately comment.
“Videos of FSD Beta 9 in action don’t show a system that makes driving safer or even less stressful,” said Jake Fisher, senior director of Consumer Reports’ Auto Test Center.
“Consumers are simply paying to be test engineers for developing technology without adequate safety protection.”
In April, Consumer Reports said its engineers were able to defeat safeguards on Tesla’s Autopilot and operate the system after getting out of the driver’s seat.
Last month, NHTSA disclosed it had opened 30 investigations into Tesla crashes involving 10 deaths since 2016 in which advanced driver assistance systems were suspected of having been in use.
Autopilot, which handles some driving tasks, was operating in at least three Tesla vehicles involved in fatal US crashes since 2016, the National Transportation Safety Board (NTSB) has said.
The NTSB has criticized Tesla’s lack of system safeguards for Autopilot, which allows drivers to keep their hands off the wheel for extended periods.
Tesla said last week that eligible owners can subscribe to FSD for $99 or $199 a month. Tesla says FSD’s “currently enabled features do not make the vehicle autonomous. The currently enabled features require a fully attentive driver, who has their hands on the wheel and is prepared to take over at any moment.”