Elon Musk: Tesla Drivers Can Text, But Should They?

Summary
– Elon Musk confirmed Tesla’s latest Full Self-Driving (FSD) software update would allow texting while driving in certain traffic contexts, despite it being illegal and unsafe.
– The author strongly warns drivers not to text while driving, as they, not Tesla or Musk, will be legally and financially liable for any crashes.
– Tesla’s current FSD is a Level 2 supervised system requiring driver attention, with in-cabin cameras that monitor and alert drivers if they look away.
– Musk has long promised an unsupervised FSD version and framed enabling phone use as a “killer app,” but the legal prohibition on texting while driving remains unchanged.
– Unlike companies like Waymo, Tesla does not accept liability for incidents under its driver-assist systems, as it acknowledges the vehicles are not fully autonomous.
The idea that a car’s software could permit texting while driving raises serious safety and legal questions, regardless of any claims by its manufacturer. The stark reality is that texting behind the wheel remains illegal across nearly the entire United States, and no software update can change the law. Drivers must understand that they, not the automaker, bear full legal and moral responsibility for their vehicle’s operation, regardless of any automated features.
A recent announcement from Tesla’s CEO has brought this issue into sharp focus. He confirmed that an upcoming version of the company’s Full Self-Driving software might allow for texting in specific traffic conditions. This statement follows months of promises about an “unsupervised” driving mode, a significant shift from the current system’s requirements. Presently, Tesla’s FSD is classified as a Level 2 driver-assistance system. This means the human driver must constantly supervise the operation, keeping their eyes on the road and hands ready to take control. The vehicles use in-cabin cameras to monitor driver attention, issuing alerts and potentially disabling the feature if the driver looks away repeatedly, a function many owners have complained about.
The suggestion that these safety prompts could be relaxed, even in stop-and-go traffic, directly conflicts with traffic safety laws. Imagine being pulled over by law enforcement for using your phone while driving; telling an officer that the car’s CEO said it was permissible would be a futile defense. In the event of a collision, the legal liability would fall squarely on the driver, not the company. Tesla has consistently defended this position in court, arguing that its systems require active driver supervision. This stands in contrast to companies operating truly driverless vehicles, which assume liability for their robotaxis.
The allure of advanced technology is powerful, and many drivers are impressed by the capabilities of driver-assistance features. However, it is a dangerous mistake to conflate these aids with full autonomy. The critical distinction is that current Tesla systems are not self-driving cars; they are tools that require constant human oversight. The ultimate “killer app” for any vehicle should be arriving safely at your destination, not the ability to compose a text message while in motion. No software update, however sophisticated, removes the driver’s fundamental duty to operate their vehicle safely and within the law. The road demands your attention, and no billionaire’s tweet is worth risking lives over.
(Source: The Verge)