On Sunday night, a Tesla crashed in North Brunswick, New Jersey, reportedly veering off the road and hitting several signs before finally stopping. The driver wasn’t hurt, but the vehicle did sustain extensive damage.
It wasn’t a particularly remarkable accident, except that the driver claims he could have prevented it — if Autopilot had let him.
According to a police report viewed by NJ.com, the “vehicle could have gone straight or taken the Adams Lane exit, but instead split the difference and went down the middle, taking the vehicle off the roadway and striking several objects at the roadside.”
The report also notes that the driver claimed the car’s Autopilot system “got confused due to the lane markings” and “that he tried to regain control of the vehicle, however it would not let him.”
Tesla doesn’t appear to buy the driver’s excuse, with a spokesperson telling Electrek in a statement, “Since we launched Autopilot in 2015, we are not aware of a single instance in which Autopilot refused to disengage.”
The spokesperson also explained how Tesla designed the system to prevent the type of scenario described by the driver.
“A driver can easily override Autopilot by lightly touching the steering wheel or brakes,” they said in the statement. “Moreover, the brakes have an independent bypass circuit that cuts power to the motor no matter what the Autopilot computer requests. And the steering wheel has enough leverage for a person to overpower the electric steering assist at all times.”
Based on the available information, this appears more likely to be a case of driver error than of Autopilot refusing to relinquish the wheel. Still, it may be a sign that as self-driving cars become more common, more drivers will blame their vehicles for their own mistakes.