(Reuters) — A U.S. senator on Friday urged Tesla to rebrand its driver assistance system Autopilot, saying it has “an inherently misleading name” and is subject to potentially dangerous misuse.
But Tesla said in a letter that it had taken steps to ensure driver engagement with the system and enhance its safety features.
The electric automaker introduced new warnings for red lights and stop signs last year “to minimize the potential risk of red light- or stop sign-running as a result of temporary driver inattention,” Tesla said in the letter.
Senator Edward Markey said he believed the potential dangers of Autopilot can be overcome. But he called for “rebranding and remarketing the system to reduce misuse, as well as building backup driver monitoring tools that will make sure no one falls asleep at the wheel.”
Markey’s comments came in a press release, with a copy of a Dec. 20 letter from Tesla addressing some of the Democratic senator’s concerns attached.
Autopilot had been engaged in at least three Tesla vehicles involved in fatal U.S. crashes since 2016.
Crashes involving Autopilot have raised questions about the driver-assistance system’s ability to detect hazards, especially stationary objects.
There are mounting safety concerns globally about systems that can perform driving tasks for extended stretches of time with little or no human intervention, but which cannot completely replace human drivers.
Markey cited videos of Tesla drivers who appeared to fall asleep behind the wheel while using Autopilot, and others in which drivers said they could defeat safeguards by sticking a banana or water bottle in the steering wheel to make it appear they were in control of the vehicle.
Tesla, in its letter, said its revisions to steering wheel monitoring meant that in most situations “a limp hand on the wheel from a sleepy driver will not work, nor will the coarse hand pressure of a person with impaired motor controls, such as a drunk driver.”
It added that devices “marketed to trick Autopilot may be able to trick the system for a short time, but generally not for an entire trip before Autopilot disengages.”
Tesla also wrote that while videos like those cited by Markey showed “a few bad actors who are grossly abusing Autopilot,” they represented only “a very small percentage of our customer base.”
Earlier this month, the U.S. National Highway Traffic Safety Administration (NHTSA) said it was launching an investigation into a 14th crash involving Tesla in which it suspects Autopilot or another advanced driver assistance system was in use.
NHTSA is probing a Dec. 29 fatal crash of a Tesla Model S in Gardena, California. In that incident, the vehicle exited the 91 Freeway, ran a red light and struck a 2006 Honda Civic, killing its two occupants.
The National Transportation Safety Board will hold a Feb. 25 hearing to determine the probable cause of a 2018 fatal Tesla Autopilot crash in Mountain View, California.
(Reporting by David Shepardson; Editing by Chizu Nomiyama and Tom Brown)