QOTD: Can You Blame Self-Driving for Crashes?

This afternoon’s story about a woman accused of killing two people while intoxicated behind the wheel of a Ford with BlueCruise is downright dystopian.

The woman’s defense is essentially that the car caused the crash, not the driver, since an autonomous system was activated.

Matt did a nice job laying it all out, including the philosophical and legal questions at hand. Since it’s a natural story for a QOTD, I wanted to piggyback off of that.

If you’re a lawyer, feel free to weigh in on the legal implications and questions.

I am very much not a lawyer, but I can at least think through the philosophical questions. The main one: is the driver at fault if he or she activates an autonomous driving system and it fails?

Personally, I would say yes, especially given the state of the tech today. BlueCruise is a Level 2 hands-free driver-assistance system, not full autonomy, and it still requires the driver to keep eyes on the road. Once again, I must remind you that there are no truly self-driving cars on the market. True self-driving would be Level 5 autonomy, and nothing available today is at that level.

So, if a human is supposed to be paying attention and ready to intervene when the system fails, then he or she is still responsible, in my view. And, of course, the driver should not be intoxicated behind the wheel in the first place.

It will get thornier if the industry ever achieves Level 5 autonomy, though. That’s a tougher call.

So, what say you? Sound off below.

[Image: Ford Motor Company]



