By David Shepardson
WASHINGTON (Reuters) – The National Highway Traffic Safety Administration has raised concerns about social media posts suggesting that Tesla's Full Self-Driving (FSD) software can be used as a robotaxi and does not require driver attention.
In October, NHTSA opened an investigation into 2.4 million Tesla vehicles equipped with FSD software after four reported crashes, including a fatal 2023 crash, in conditions such as sun glare, fog, and airborne dust.
In a May 14 email made public on Friday, NHTSA told Tesla that the company's social media posts could encourage people to view FSD as a robotaxi rather than a partially automated driver-assistance system that "requires sustained attention and intermittent intervention by the driver."
The agency cited Tesla posts on X, including one about a driver who used FSD to travel 13 miles (21 km) from home to an emergency room during a heart attack, and another depicting a driver using FSD to return home from a sporting event.
"We believe that Tesla's posts conflict with its stated messaging that the driver is to maintain continued control over the dynamic driving task," NHTSA wrote, asking Tesla to reconsider its communications.
When Tesla met with regulators in May about the social media posts, it told NHTSA that its owner's manual and other materials inform drivers that the vehicles are not self-driving and that they must remain constantly alert.
Tesla did not immediately comment on Friday. Elon Musk is Tesla's CEO and the owner of the social media site X, formerly known as Twitter.
NHTSA on Friday also released a letter to Tesla, dated Monday, asking the company to answer questions in the investigation by December 18, including whether FSD can operate safely given "potential malfunctions, including detection and appropriate response in certain situations where roadway visibility may be limited."
NHTSA added: “The investigation will consider the adequacy of the feedback and information the system provides to drivers to enable them to make real-time decisions when the system’s capabilities are exceeded.”
In the fatal crash, a 71-year-old woman in Rimrock, Arizona, who had gotten out of her car after a rear-end collision involving two other vehicles, was struck and killed by a Tesla operating in FSD mode while its driver was battling sun glare. The driver was not charged.
In December 2023, under pressure from NHTSA, Tesla agreed to recall more than 2 million vehicles in the United States to install new safeguards in its Autopilot advanced driver-assistance system. The agency is still assessing the adequacy of those safeguards.
(Reporting by David Shepardson; Editing by Diane Craft and Bill Berkrot)