Just thinking, it will really cheese off the road rage types. Do they attack the person in the seat, or the vehicle?
It's never going to be anything I'll have to think about. I just won't have anything to do with autonomous vehicles. I know a couple of people who already have them, and they get so frustrated they turn the systems off.
If you were an autonomous system manufacturer, would you be prepared to cop the repercussions of whatever problems your system causes, as if you were the driver of the vehicle?
For example, if one of your vehicles kills a cyclist and the legal entity is up on manslaughter charges, will you do any jail time?
If it isn't done that way, and if the driver isn't held responsible (both should be, in my view), then injury and death become corporatised and monetised, and lives are just seen as sacrificial dollar signs.
No person is perfect, no company is perfect, no system is perfect. Car manufacturers already kill people by negligence, faulty ignition switches, faulty gearboxes, faulty alternators, faulty airbags etc etc. Don't let your burning desire for retributive justice cloud your thinking, it won't help work out the levels of liability involved. At first it will be a lawyer's picnic and the winners will be the lawyers not the people they're working for. Successful automated driving is too large a prize to not become widespread throughout the transport industry.
Unless a company deliberately designs something like a "kill cyclists" option into their autonomous driving programming, all that will happen is a war of words and eventually an exchange of money. That's the way the world works. VW found out the cost of lying in bulk and has paid the price. However, it's still profitable and continues to sell vehicles in large numbers.
I actually think you have made my point.
In emergency situations when human impacts and personal consequences can be high people go to great lengths to mitigate bad outcomes. Give that job to a corporate machine, and the care factor is potentially way lower.
We are already seeing this happening with the way corporates like Tesla are rolling out unproven technology for profit, at the expense of human lives.
I think they're taking a leaf out of Microsoft's book. Why pay for beta testing, when you can get the public to pay you to do it?
Of course, Windows crashing is inconvenient, your Tesla crashing due to an "undocumented feature" is a whole other story!
Yes but it's always been this way, and most people aren't able to make rational split second decisions either. So long as autonomous cars are better than the inexperienced, the stupid, the angry, the tired, the arrogant, the stoned, the drunk, the ancient, the diabetic, the suddenly deceased drivers etc, then they will be of overall benefit to road safety.