Are We Ready for ‘Eyes-Off’ Driving?

Summary
– GM is developing Level 3 “eyes-off” driving technology for some highways, allowing drivers to take their hands and eyes off driving tasks, but the company hasn’t clarified who bears responsibility when issues occur.
– Level 3 systems require drivers to remain ready to take control when prompted, with liability falling on them if they fail to respond quickly during incidents.
– Multiple automakers including GM, Ford, and Mercedes-Benz are pursuing Level 3 technology, but its use is currently restricted in most regions due to regulatory uncertainties.
– Legal precedents show human drivers are often held responsible in crashes involving automated systems, though automakers like Tesla have also faced partial liability in some cases.
– Experts express concerns about human drivers struggling to re-engage quickly after disengagement and the challenges automated systems face in unpredictable real-world environments.
The automotive industry is accelerating toward a future where drivers can legally take their eyes off the road, but critical questions about safety and legal responsibility remain unresolved. General Motors recently joined a growing roster of car manufacturers developing what’s known as “eyes-off” driving technology. Unlike the distracted driving many motorists already engage in, this represents a formal step toward fully autonomous private vehicles. GM’s existing Super Cruise system permits hands-free operation while monitoring the driver’s gaze to ensure attention stays on the road. The new system, classified as Level 3 automation, would go further, enabling drivers on certain U.S. highways to remove their hands from the wheel and their eyes from the road ahead.
GM plans to introduce its Level 3 technology by 2028, starting with the Cadillac Escalade IQ before expanding to Chevrolet, Buick, and GMC models. This advancement promises to let drivers use phones, watch videos, or play games without legal penalty, but only under specific conditions. The catch is that drivers must remain prepared to retake control immediately when the system requests it. Failing to respond in time could leave the driver liable if an incident occurs. Given the unpredictable nature of real-world driving, the potential for something to go wrong is constant.
Dr. Alexandra Mueller, a senior research scientist at the Insurance Institute for Highway Safety, observes that conditional automation introduces significant complications. She notes, “With Level 3 automation, things get messier. That’s where many concerns originate, because there’s a great deal we simply don’t understand about how these systems will perform.”
The list of automakers investing in Level 3 automation is expanding. Alongside GM, companies like Ford, Stellantis (parent of Jeep), and Honda are actively developing similar systems. Mercedes-Benz has already launched its Drive Pilot Level 3 system, though its use is currently restricted to designated highways in California and Nevada. This highlights a central challenge: automakers are pushing a technology that remains illegal across much of the country and around the world. Germany and Japan have granted limited, temporary approvals for BMW and Honda, but widespread adoption awaits regulatory clarity.
Regulators face a complex task in determining liability for systems that shift control between the vehicle and the human driver. Mercedes states that it will assume responsibility for crashes caused by its Drive Pilot while the system is active. However, this guarantee is conditional: the driver remains accountable if they ignore prompts to resume control or misuse the technology. Tesla’s Level 2 systems, Autopilot and Full Self-Driving, have already demonstrated the risks of this ambiguity. Federal investigators examining multiple Tesla crashes found that Autopilot often disengaged less than one second before impact. While no evidence suggested intentional evasion of responsibility, the pattern raises serious safety concerns.
Automakers point to the array of onboard sensors (cameras, infrared trackers, torque sensors) as tools to clarify fault in the event of a crash. At the unveiling of GM’s eyes-off system, CEO Mary Barra emphasized that enhanced sensing capabilities would allow the company to reconstruct events with precision. She affirmed that General Motors would take responsibility where appropriate, but the fundamental contradiction of Level 3 remains: drivers are encouraged to disengage from driving yet must stay ready to re-engage instantly.
Planned transitions, like entering or leaving a mapped highway zone, may proceed smoothly. Unplanned events (sudden weather changes, road hazards, or erratic behavior from other vehicles) can overwhelm the system’s capabilities. Research indicates that humans perform poorly when abruptly pulled back into complex control tasks after periods of inactivity. A driver who has been disengaged may oversteer, brake excessively, or fail to react appropriately because they haven’t been monitoring the road. These errors can trigger dangerous chain reactions.
Dr. Mueller highlights the challenge of mixed traffic environments, stating, “The mixed fleet scenario, which will likely persist for decades, creates a highly uncontrolled setting where even advanced automated systems will struggle. They’ll continue to struggle because we operate in a chaotic, dynamic world where conditions are always shifting.”
Early legal cases are beginning to establish that human drivers bear significant responsibility even when automation is active. In Arizona, the safety driver of an Uber autonomous test vehicle, initially charged with negligent homicide after a fatal 2018 crash, pleaded guilty to an endangerment charge. Similarly, a Tesla driver using Autopilot pleaded no contest to vehicular manslaughter after a crash that killed two people. In both instances, prosecutors argued that the human operator was ultimately accountable for the vehicle’s actions.
Automakers may welcome these rulings, but other cases have found manufacturers partly liable. A Florida jury recently determined that Tesla shared responsibility for a crash in which a Model S on Autopilot killed one person and seriously injured another. While the driver was found to bear the greater share of fault, Tesla was ordered to pay $243 million in damages.
According to Mike Nelson, a trial attorney specializing in mobility law, legal standards for automation-related crashes are still in their infancy. Precedents from Level 2 cases will shape future rulings on Level 3 and higher systems. However, the general lack of technical understanding among judges, lawyers, and juries introduces a high degree of uncertainty. Nelson advises automakers to prioritize transparency as roads become shared by human drivers and automated systems. He explains that juries respond favorably to companies that acknowledge and address problems rather than conceal them.
“The current legal confusion isn’t surprising,” Nelson reflects. “We’ve seen this pattern with every major industrial revolution. Clarity will come, but we’re in for a turbulent transition.”
(Source: The Verge)