
Dallas drivers may have noticed some unusual vehicles moving through traffic lately. They look slightly futuristic: a spinning sensor on the roof, cameras lining the sides, and, most notably, no one in the driver’s seat. Those are autonomous vehicles from Waymo, a subsidiary of Google’s parent company Alphabet and a leader in the driverless “robotaxi” field. Waymo already operates in cities like San Francisco and Austin, and it officially opened for business in Dallas on Tuesday, February 24.

I spend a lot of time thinking about how new technology interacts with the law, so this rollout has my attention. It’s exciting, but it also raises questions that most people may not have considered before. Let’s unpack some of those.

What Is a Waymo Vehicle?

Waymo began as Google’s self-driving car project. Today it runs a ride-hailing service that uses fully autonomous vehicles in certain areas, including parts of Dallas. There is literally no one behind the wheel of a Waymo; software handles all the driving.

Most of the Waymo fleet is modified Jaguar I-PACE electric SUVs. Each is packed with sensors that help it navigate its surroundings. Lidar uses pulses of laser light to measure distances. Radar tracks movement, speed, and direction of objects. Cameras read traffic lights, watch for pedestrians, and follow road signs. All of that data feeds into a computer that makes constant driving decisions in real time.

The biggest difference between these taxis and your typical rideshare is clear: There’s no human driver. We’re not talking about cruise control or lane assist, but rather full autonomous navigation around a mapped service area. You summon a ride through the app, an SUV with an empty driver’s seat pulls up to the curb, and away you go.

I’m not some giddy tech evangelist, but that’s pretty huge. For centuries, a living operator has controlled pretty much every wheeled object on the road. Our legal system grew around that idea, and when a crash happened, a human being was presumed responsible. With autonomous vehicles, that assumption no longer holds. If a Waymo crashes into you, you can’t get out and exchange insurance information with the driverless car.

Waymo Vehicles Still Crash.

It’s important to say this clearly: Waymo cars may statistically appear safer than cars with human drivers, but they still cause accidents.

The real world is messy, and computers can’t always keep up. Construction zones appear overnight. Traffic comes to a sudden halt. Cars dart into the road from alleys, and some pedestrians cross wherever they want. On top of all that, the tech itself can malfunction. Sensors can be blocked or damaged. Software may misread data. When you combine all of that, it’s clear that mistakes are still possible.

Consider Murphy’s Law: “If it can go wrong, it will.” Maybe that’s a little dramatic, but Waymo vehicles have been in crashes around the country—some minor, others more serious. In a few cases, investigators found the autonomous system took questionable action and caused the wreck.

So again: The technology is advanced but not flawless. As more robotaxis share the road with Dallas drivers, accidents will happen. When they do, the legal issues that follow quickly become complicated.

What Remedy Do I Have if a Waymo Injures Me?

If you’re injured by a Waymo, there’s no automatic right to compensation for your injuries. Legally speaking, you must negotiate a settlement directly with Waymo or prove the company’s negligence in court.

As a general rule, if your injuries are minor then it’s probably not worth the time and expense of a lawsuit. On the other hand, if you suffered serious injuries or lost a loved one because of a Waymo, it’s probably a good idea to contact an attorney.

But if Waymo’s fault is obvious, why involve a lawyer? Because legal matters are rarely simple, and lawsuits are no exception.

Proving Waymo’s Negligence Requires Understanding the Law and the Courts

No matter how egregious the mistake, Waymo cannot be forced to pay compensation unless a jury determines the company behaved negligently. Negligence consists of four elements:

  • Waymo owed you a legal duty of care.
  • Waymo breached that duty.
  • The breach directly caused your injuries.
  • You suffered damages as a result.

On top of that, proving these claims in court means filing a properly formatted lawsuit, meeting deadlines to demand and produce evidence, deposing witnesses, and thoroughly knowing case law. That’s too much to ask of most people, to say nothing of injured accident victims with a lot on their minds.

I’m not just saying that people with serious injuries need a lawyer because that’s what every lawyer says. I’ve seen firsthand how complex these kinds of cases can be and what it takes to succeed. People who have never gone toe-to-toe with a corporate defendant, lucky as they are, don’t appreciate how hard that fight is. Companies like Waymo fight tooth and nail against accepting liability for accidents. Let’s talk about how they do that.

How Can Waymo Dispute Liability for Accidents?

I’m not the kind of attorney who throws everything at the wall and hopes something sticks to make a company liable. Those lawyers do exist, though, which is why the law provides plenty of tools for companies to protect themselves. The problem is that companies use that whole toolbox against every potential claim, no matter its actual merits. Even if their product caused serious injury or death, they’ll force your attorney to make a compelling case.

But how would a company deflect blame if their employee (or driverless product) caused damages? Here are some arguments they might raise:

  • One likely argument is that the autonomous system wasn’t actually at fault. The vehicle’s many sensors could have data showing someone else ran a red light, sped, or otherwise drove unsafely. That data would be crucial evidence for both sides of a lawsuit, since it may paint the best picture of what occurred.
  • Second, they might argue comparative fault. Texas follows a modified comparative fault system, which apportions liability by each party’s percentage of responsibility. For example, if the victim of a Waymo accident is found more than 50 percent responsible for it, they recover nothing. If they’re found 50 percent or less at fault, their compensation is still reduced by their share of the blame. Since shifting fault elsewhere can get a case thrown out or shrink the payout, of course the defense will try.
  • Finally, there’s the question of who built what. Most modern technology has a lot of cooks in the kitchen, relying on parts and software from many sources. So what caused a hypothetical Waymo wreck? Was it a faulty part? A programming bug? A hardware malfunction? Some combination? The answers could spread liability across multiple parties.

Those aren’t all the possible arguments, just some of the “greatest hits” of corporate defense. High coverage limits usually come with large defense teams and cutthroat strategies. Because of that, injured victims and their families shouldn’t expect a simple or quick resolution.

Welcome to Dallas, Waymo. We’ll Be Watching.

The arrival of Waymo vehicles in Dallas marks a real turning point. Self-driving cars are no longer a futuristic pipe dream. They’re here, sharing the road with us.

This technology could truly be a “game changer” if it’s developed carefully. Fewer drunk and distracted drivers could make the streets safer over time. But there’s also a risk in trusting new technology too quickly. When things go wrong (Murphy’s Law again), the legal system becomes the place to sort out the consequences.

Dallas is now part of that experiment. The vehicles may be cutting edge, but the questions after a crash are the same as they have always been. Who was at fault? Who pays? How do we make injured people whole? Those answers won’t come from sensors or software alone.
