Fully self-driving vehicles are being tested in various locations across the United States – including Texas.
While the companies developing the technology remain convinced that increased automation will dramatically reduce accidents, the current technology has limitations that can make autonomous vehicles (AVs) dangerous under certain circumstances.
Though Texas drivers may encounter a driverless vehicle on the roadways, accidents are still far more likely to be caused by human rather than computer error. So, are self-driving cars safe? The car accident lawyers at Farah Law can help you understand this new technology and what it means for you.
Automated Driving Assistance Technology Is Evolving
Increased safety is the primary goal of automated driving assistance (ADA) technologies. Certain levels of automation are currently considered safe for mass production, but no truly self-driving vehicles are available to the general public yet. The National Highway Traffic Safety Administration (NHTSA) prefers the term ‘automated driving’ to ‘self-driving’ because the automation now available in US vehicles still requires a human to direct and monitor the system.
The automotive industry has developed a scale to describe the various levels of driving automation currently available. Human involvement is still primary at the lowest levels of automation, while a fully automated vehicle does not require human input.
- Level 0 – Momentary Driver Assistance: Automated features provide warnings or brief intervention, such as automatic emergency braking. The driver is responsible for all ongoing driving functions.
- Level 1 – Driver Assistance: The system provides continuous assistance with either acceleration/braking or steering, but not both at once. The driver must remain engaged and monitor the vehicle.
- Level 2 – Additional Driver Assistance: The system provides continuous assistance with acceleration, braking, and steering simultaneously. The driver must constantly monitor and be ready to take over.
- Level 3 – Conditional Automation: The system can handle all driving functions under limited conditions. A driver must be available to take over whenever the system can’t operate.
- Level 4 – High Automation: Systems operate within limited service areas and handle all driving tasks without human help.
- Level 5 – Full Automation: System is capable of managing all driving tasks on any roadway and under all kinds of driving conditions with no human intervention.
Most vehicles for sale in the US that offer a higher level of driver assistance, including Teslas, are at Level 2. The only major carmaker currently offering Level 3 automation in vehicles sold in the US is Mercedes-Benz, and its Drive Pilot technology is currently available only in California and Nevada.
Autonomous Vehicle Testing in Texas
Driverless vehicles (levels 4 and 5 automation) are being tested on Texas roadways. In 2017, the Texas Department of Transportation (TxDOT) authorized the operation of AVs and encouraged companies to develop their technology in the Lone Star State. However, AV operation is subject to state regulation only, and local governments are not authorized to regulate autonomous vehicles.
Austin, Dallas, and Houston are participating in the NHTSA’s AV TEST Initiative, which aims to improve safety and testing transparency. Participants agree to submit information about automated vehicles and testing to the NHTSA, which makes the information available to the public.
Autonomous vehicle company Waymo has been testing its self-driving vehicles in Austin since March 2024. It offers rides in driverless vehicles within a 37-square-mile area and has logged thousands of weekly rides. Sixteen incidents have been reported, primarily involving safety issues, with only two collisions.
In 2023, a driverless truck lane opened along I-45 between Dallas and Houston. Aurora Innovation, a company developing self-driving semi-trucks, was set to begin testing its trucks without human oversight in 2024 but has pushed back its driverless semi launch to April 2025. Other self-driving truck technology companies are also taking advantage of the AV-friendly environment Texas offers.
Is Automated Control Safer Than Human Control?
Early studies suggest that self-driving cars may be safer than human-driven cars when faced with routine driving conditions. However, the data also shows that automated technology is limited by its programming and still needs further testing and refinement. Study results also show that autonomous vehicles are five times more likely to crash in low-light conditions, such as at dawn or dusk, than human-piloted vehicles.
AVs rely on cameras for vision, and insufficient lighting has caused problems with accurate recognition, leading to inappropriate responses. AV technology also struggles with turns, where its accident rate is roughly twice that of human drivers. Left turns are especially challenging for AVs because they involve a more complex evaluation of traffic conditions, including vehicle speeds, object locations, and judging when it is safe to execute the turn.
There Is Little Public Confidence in the Safety of Self-Driving Cars
The artificial intelligence (AI) operating a self-driving car is trained on large amounts of data about driving and how to respond to various traffic situations and road conditions. The information gathered by the vehicle’s sensors allows the AI to recognize the situation and predict the correct response based on what it has learned. Problems occur when a vehicle’s sensors provide inaccurate information or when the AI can’t arrive at the appropriate response because the situation presents unique circumstances.
Unfortunately, these kinds of errors have resulted in several fatal accidents, making the public wary of fully autonomous vehicles. A recent survey by AAA found that most US drivers surveyed expressed either fear (66%) or uncertainty (25%) about driverless vehicles. Although confidence in autonomous vehicles has declined slightly in the last few years, most drivers are in favor of automated driving assistance, such as emergency braking and lane keeping.
According to the Pew Research Center, only 26% of Americans believe widespread use of driverless cars is a good idea. About 45% would not be comfortable sharing the road with self-driving cars. Women like the idea far less than men, and people over 50 would have a harder time accepting driverless vehicles than younger drivers.
What Happens When a Self-Driving Car Causes an Accident?
As Texas law now stands, an automated driving system is considered a licensed driver of the vehicle, and the owner of the automated driving system is considered the vehicle’s operator for purposes of traffic and motor vehicle laws, even if the owner is not present in the vehicle. So, if an AV runs a red light and causes an accident, the vehicle’s owner is the starting point for liability.
However, since the software is supposed to be programmed to stop at red lights, the software developer or the vehicle manufacturer could also be responsible if some kind of error caused the malfunction. The law in this area will likely continue to develop as self-driving vehicles become more widely used.
Even Safe Self-Driving Cars Won’t Eliminate Accidents Any Time Soon
Automated driving assistance aims to improve traffic safety, and driverless cars may one day reduce the frequency and severity of crashes. For now, however, the crash rate in Texas remains unacceptably high: in 2023, someone was injured every 2 minutes and killed every 2 hours in traffic accidents on Texas roads.
The experienced personal injury attorneys at Farah Law know firsthand how dangerous Texas roads can be and have seen the damage caused by careless drivers. With over 40 years of experience, they have recovered millions of dollars on behalf of injured clients. Contact Farah Law today to arrange a free consultation and learn how we can help you recover compensation for injuries from a self-driving or any other type of car accident.