Regulating the Future: Is Austin ready for robot cars? Does that even matter?
Robo-cars. Driverless vehicles. Unmanned cars. Autonomous vehicles. Call them what you want—but they are here in Austin, and they are here to stay.
Though public attention has only recently increased, autonomous vehicles (“AVs”) have actually been legal in Texas since our legislature unanimously passed Senate Bill 2205 (31-0) in 2017. The bill’s key provision boosted the AV industry and sped the technology’s development in Texas by barring cities from regulating it. Thus, in September 2017, our legislature made its intent clear: whether Texas cities are ready for this technology matters less than promoting it. So, what do we know? And what, if anything, can we do to protect ourselves?
Simply put, cities in Texas cannot regulate AVs. Rules for AVs are therefore uniform across the State; local municipalities cannot create their own regulations. To drive itself around Austin (or any other Texas city), an AV must be (1) capable of operating in compliance with Texas traffic and motor-vehicle laws, (2) equipped with a data-recording system, (3) able to stop immediately after a crash and notify the proper authorities of the incident, and (4) equipped with an automated driving system that complies with applicable federal law and federal motor-vehicle safety standards. When an AV is uncertain of the safest way to proceed, it will pull over and turn on its hazard lights. General Motors’ (“GM”) Cruise AVs, for example, are supported by Remote Assistance advisors who monitor the fleet and occasionally assist vehicles in incidents like this. “Within our dedicated team who continuously monitor and assist our driverless fleet, our Remote Assistance (RA) advisors are available in instances when the AV needs help navigating a situation. RA advisors have access to live data from the vehicle and by connecting remotely to the vehicle, they can suggest a pathway that allows the vehicle to proceed. We’re working to minimize how often this happens, but it is and will remain one aspect of our overall safety operations,” GM explained in a company statement.
While AVs must be inspected and licensed, the procedures are different from those for regular cars. According to the Texas Department of Public Safety, “if a vehicle is truly automated a licensed human operator is not required to be in the vehicle and the vehicle itself is considered to be licensed to drive. In such situation, the owner of the vehicle is considered to be the operator for the purposes of assessing compliance with traffic laws.” But Texas also requires all drivers to carry a minimum-liability insurance policy on their vehicles, in case of an at-fault accident. So, how do we assess liability for an accident caused by an AV?
The answer to this question is an evolving legal challenge, and can vary depending on the AV’s automation level. The Society of Automotive Engineers (“SAE”) has defined levels of automation for self-driving cars, ranging from Level 0 (no automation) to Level 5 (full automation).
Level 1 – Driver Assistance
This level involves some automation, but the human driver must remain engaged and monitor the environment. A lane-keeping system that helps steer the vehicle toward the center of its lane is one example.
Level 2 – Partial Automation
At this level, the vehicle can control both steering and acceleration/deceleration simultaneously. However, a human driver must still monitor the environment and be prepared to take over at any moment. Systems such as Tesla’s Full Self-Driving and GM’s Super Cruise are considered Level 2.
Level 3 – Conditional Automation
The vehicle can handle all aspects of driving in certain conditions and environments. A human driver is still required but does not need to monitor the environment constantly; the human must, however, be available to take over when the system requests. Honda has reportedly released the first Level 3 AV, but it is available only for lease in Japan.
Level 4 – High Automation
The vehicle can operate without human input or oversight in specific environments or conditions, though it may not be capable of handling every possible situation. At Level 4, human intervention is not required. Level 4 driverless rideshare vehicles are in limited testing but not approved for general use anywhere in the U.S.
Level 5 – Full Automation
This level represents complete automation. The vehicle can perform all driving tasks under all conditions that a human driver could navigate, and no human driver is required at all. Fully self-driving systems are still theoretical and are not expected to be available until after 2035.
Thus, where an AV qualifies as Level 0 or Level 1 automation, the human driver is typically primarily responsible. But as automation levels increase, the potential grows for liability to shift toward the manufacturers, software developers, or other entities responsible for the vehicle’s autonomous systems. The Texas Transportation Code states that the owner of the automated driving system “is considered the operator of the automated motor vehicle solely for the purpose of assessing compliance with applicable traffic or motor vehicle laws, regardless of whether the person is physically present in the vehicle while the vehicle is operating.”
Notwithstanding this information, most questions regarding AV-accident liability remain unanswered. Even at lower automation levels, what happens when the vehicle, not the insured driver, was in control at the time of the crash? Because insurance companies cannot take a driver’s statement from an AV, they might have to analyze information provided by the “black boxes,” or electronic-control modules, in the AV. As a result, the AV’s manufacturer might have to prove that its vehicle didn’t cause the crash, rather than the other vehicle’s driver having to prove that the AV was responsible.
So, could an AV crash theoretically constitute a product-liability case, focusing on the AV’s design or manufacture, rather than driver negligence? Can AVs implicate their manufacturer more than their driver or operator? And what if the manufacturer is GM, but Google produced the autonomous technology? Though all these questions, and countless more, remain unanswered, circulating in conference rooms and legal debates, Texas decided not only to allow AVs on its streets but to accelerate the technology’s development by barring city-specific regulation. The next obvious question, then, is why would the State do this to its citizens?
According to the National Highway Traffic Safety Administration, human-driver error causes 94% of all vehicle collisions. And the Texas SB 1308 Final Report, submitted on January 1, 2023, to the Texas legislature by TxDOT and DPS, in consultation with the Texas A&M Transportation Institute and the appropriate federal agencies, studied AVs’ potential benefits and impacts on transportation needs, including the impact of AVs on driver and public safety. The report noted that “the number of people per mile who lose their life in traffic crashes in Texas is higher than the number of deaths per mile for fatalities nationwide” and that, based on its study, “technology advancements [such as AVs] have the potential to address key public safety needs.”
Indeed, the Texas legislature believes AVs can substantially reduce human-driver errors, stating that AVs could ease traffic congestion and cut crash injuries and fatalities, particularly those caused by drunk driving or speeding. As the now-Texan Elon Musk remarked back in 2015, human-driven cars may actually be outlawed once AVs become safer. AVs could also provide new mobility options for disabled and elderly Texans who currently depend on others to drive. As Paul Hemmersbaugh, GM’s chief counsel and public policy director for transportation as a service, said in a statement: “Self-driving vehicles have great potential to create a better, safer future for all.”
Notwithstanding these best intentions, tweets abound showing GM’s AVs causing varying levels of havoc across Austin: one shows a Cruise AV stopped in a lane of Martin Luther King, Jr. Boulevard; another shows one stopped near the intersection of West 15th Street and Trinity Street.
So, do you trust the robots? Does it matter? Maybe if we can’t beat them, we should join them? That’s exactly the approach many Austin citizens are taking. To try out a Cruise AV, you must join GM’s waitlist; Cruise offers rides between 8 PM and 5:30 AM every day.
And if your AV experience doesn’t go as planned, remember, the reason robots are so insecure is because they only have artificial intelligence. Stay safe out there, and rust in peace.
Cobb & Johns are Special Forces for Complex Property and Government Disputes.