Some Self-Driving Cars Are Already Getting Into Accidents: Who Should Be Responsible?


What do Google, Audi, and Tesla have in common? All three are investing heavily in self-driving, or autonomous, vehicles. That shouldn’t be surprising, considering the future demand for them.

According to Precedence Research, the market for these vehicles will grow at a compound annual growth rate (CAGR) of over 60% from 2020 to 2027. Meanwhile, Grand View Research forecasts that at least 4 million of these units will be sold by 2030.

However, even before companies can mass-produce them, autonomous cars are already a hot topic, not so much for their technology as for their safety. For motorists and pedestrians, they may even raise the question of whether a personal injury lawyer will be needed sometime soon.

Where the Problem Lies

From a personal injury standpoint, the problem lies in two things:

  • How safe are these vehicles?
  • If they get into an accident, who will be responsible?

Let’s begin with safety.

One of the most common arguments for self-driving cars is safety. For many proponents, they are a safer option than human-driven vehicles because people are prone to error. In 2018 alone, over 35,000 people died in motor vehicle crashes, according to the National Highway Traffic Safety Administration (NHTSA).

Speeding, driving under the influence, and distracted driving are the leading causes of these accidents—and they are all human-related and completely preventable.

These self-driving vehicles will come with fully automated safety features, such as traffic jam assist, self-parking, rearview video systems, and more. The driver then becomes just another passenger who, in theory, could be impaired or exhausted without having to worry about crashing.

But until these vehicles are mass-produced, these supposed benefits remain largely theoretical. In fact, some that are already on the road have been involved in crashes.

In 2018, a self-driving Uber test vehicle struck and killed a pedestrian in Tempe, Arizona, after its backup driver failed to pay attention to the road. Instead, she was watching a television show on her mobile device.

And this wasn’t the first accident. According to the National Transportation Safety Board, these vehicles had already been involved in over 35 crashes with other vehicles and pedestrians.

Meanwhile, the Insurance Institute for Highway Safety has a grimmer prediction for the future of autonomous vehicles. According to one of its studies, over 60% of vehicle accidents would still occur even with self-driving cars on the road.


The reason is simple: driving is a complex task. Experts believe autonomous vehicles will perform well against crashes caused by conditions such as low visibility, distraction, and driver impairment or medical emergency.

However, unless these vehicles develop the ability to make quick, critical decisions and predict how other road users will move, accidents can still happen. Take, for example, a child who suddenly darts across the street or a car making an unexpected left turn at a corner.

For all these reasons, the autonomous vehicles designed today are not 100% self-reliant. Instead, they make human driving optional. But that brings us to the second issue: in the event of an accident, who should be responsible?

The Liable Actors

The answer isn’t simple; it depends on a variety of factors. Consider the Tempe, Arizona, accident.

Some lawyers argued that the problem lay in the vehicle’s technology, particularly its software and sensors. The software, for instance, should have predicted the victim’s distance and likely movement. Instead, it appeared to have failed to anticipate the pedestrian’s next move.

If this were the only issue, the blame would have been directed at the company behind the self-driving system, Uber. But the company maintained it wasn’t negligent. The sensors, for example, were designed to work at night and were believed to have functioned properly during the accident, which occurred after dark.

Fingers then pointed toward the victim herself, who, according to toxicology reports, had abnormal levels of illegal substances in her bloodstream. Based on that data alone, perhaps Uber wasn’t completely at fault.

But it wasn’t clear whether she was impaired when she crossed the street with her bicycle in front of the Uber car. Further, some believed she was still a considerable distance from the vehicle before she was struck, even though she wasn’t looking at the passing cars.

In the end, responsibility fell on the backup driver, as her distracted driving appeared to be the major factor that led to the pedestrian’s death.

Regardless of the accidents that may involve them, self-driving vehicles are still the future of cars. However, the complexity of determining who is negligent in an accident may warrant clear state and federal policies even before these vehicles fill the roads.
