More Q’s Than A’s
We still have a lot of questions to answer before autonomous vehicles can go mainstream: Who’s at fault if an AV has an accident? Should people need licenses to ride in a self-driving car? How should an AV decide between running over a dog or a cat?
On Wednesday, Bloomberg published a story focused on yet another question — how should AVs interact with law enforcement? — and the solution might involve ceding control of your car to cops.
The Bloomberg story notes the Dec. 2018 incident in which an intoxicated driver fell asleep behind the wheel of a Tesla with Autopilot engaged. The vehicle led police on a seven-minute chase down a freeway before officers were able to compel the Tesla to stop by essentially boxing it in.
This is the kind of problem AV manufacturers and law enforcement want to avoid, and that could mean programming AVs to pull over as soon as they detect flashing police lights behind them — a protocol Waymo has already adopted.
Bloomberg even suggests that officers forced to exit their vehicles might be able to instruct other AVs to reroute away from an area “with a couple of taps on a handheld device.”
Letting law enforcement control a car presumably owned by a citizen seems like murky legal territory.
Even if it were legal, it's easy to see how some people might object to police being able to give instructions to their car — especially if the car is programmed to follow police orders over those of the driver even when the driver isn't doing anything illegal.
Some critics have also warned that hackers could exploit any mechanism that lets police control AVs.