Self Driving Cars
#1
I believe that back when I used to post more frequently on this forum, I mentioned the dangers of US society accepting, or at least not protesting, the sale and deployment of self-driving cars on public roads.
Any bona fide car enthusiast should view a machine driving his car for him as blasphemy against his enthusiasm, if you'll allow me to put it that way...
Uber self-driving car kills pedestrian in first fatal autonomous crash
by Matt McFarland, @mattmcfarland, March 19, 2018, 1:40 PM ET
Uber has removed its self-driving cars from the roads following what is believed to be the first fatality involving a fully autonomous car.
A self-driving Uber SUV struck and killed 49-year-old Elaine Herzberg as she walked her bicycle across a street in Tempe, Arizona, Sunday night, according to the Tempe police. The department is investigating the crash. A driver was behind the wheel of the Volvo XC90 SUV at the time, the police said.
Uber pulls self-driving cars after first fatal crash of autonomous vehicle - Mar. 19, 2018
#3
The lady was not in a crosswalk, BTW. We don't know how she crossed the street. Slowly, so the car could see her? Or did she bolt out? Too many unknowns. People seem to think an autonomous car should never have accidents. That is untrue. As long as human drivers coexist with autonomous drivers, accidents will still happen.
#4
This should probably be in Car Talk or something like that. But anyway, here's my take:
Autonomous car technology isn't the problem. With the big bucks being thrown at it by Google, Uber, and Microsoft, it's only a matter of time (probably a short amount of time) before the technology is viable. What I believe will be the sticking point is liability.
We currently address auto liability through individual drivers carrying auto insurance, which covers the other party or themselves when an accident occurs. Relatively simple. Granted, insurance claims can be a bit complicated, but the system is set up in such a way that it's pretty easy to point the finger at one party or another (that party usually being the insurance company, if you've paid your premiums).
The million-dollar question is: who's liable with an autonomous car? If we're talking about something like Tesla's Autopilot, then it's still the driver. That system is meant only as an "assist" and is clearly described as such. However, with a truly autonomous car that requires NO human input and clearly states as much, it would be unreasonable to hold the owner accountable, and impossible in the case of the car being a taxi or an Uber.
The only one I can see taking on the liability is the manufacturer of the car. Obviously, no manufacturer is going to take full accountability for its cars' actions. In many situations, the car's computer may have to make a split-second decision between two "bads," such as hitting the old lady with the walker or running into a cyclist. That's extreme, but with millions of cars in millions of different situations, there are going to be situations exactly like that. If an auto manufacturer takes on that liability, it'll have a lot of lawsuits on its hands real quick.
Insurance companies could take on the risk. But how would you feel as a consumer if your automated car got into an accident (at no fault of your own) and your rates went up? There would be a world of pissing and moaning over something like that.
Anyway, I'm sure they'll figure something out, but it's going to have to be different from what we're doing now.
#5
I spend a lot of time in Tempe right now, and I see these and the Waymo cars testing all over the place every day. I hate them, but the Uber ones are the only ones I've seen have problems. One of the Uber Volvos ended up on its side in an accident near ASU a few months back, too. I'm thinking Waymo (née Google) may be quite a bit ahead of Uber on this at this point.
#6
It will be interesting to know the details of this fatality....
Self-driving cars will still have to make decisions like humans would, and some of those decisions could kill people. For instance: a self-driving car is going down the road, about to pass a truck coming in the opposite direction, with an elderly lady on the sidewalk to the right. Suddenly a child runs out into the road in front of the car. Does the car go straight and hit the kid, swerve left into the truck, or swerve right and hit the old lady? Self-driving cars need to be programmed to make those decisions. The question is, which car would you buy? The one that could endanger its passengers by going left into the truck, or the one that will kill the kid but not endanger the passengers?
Many questions...
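That "pick among bads" dilemma can be sketched as an explicit cost function. To be clear, everything below is invented for illustration: the maneuver names, the harm scores, and the passenger weighting are made-up numbers, not how any real autonomous system works. But it shows how a single tuning choice (how much the car favors its own occupants) flips the outcome, which is exactly the buyer's dilemma in the post above.

```python
# Toy sketch: choosing among bad maneuvers by minimizing a
# hand-tuned harm cost. All scores and weights are invented.

# Estimated harm (0-100) to each party under each maneuver,
# for the truck/child/elderly-lady scenario described above.
SCENARIO = {
    "straight":     {"child": 90, "truck_occupants": 0,  "elderly_lady": 0,  "passengers": 5},
    "swerve_left":  {"child": 0,  "truck_occupants": 40, "elderly_lady": 0,  "passengers": 70},
    "swerve_right": {"child": 0,  "truck_occupants": 0,  "elderly_lady": 80, "passengers": 10},
}

def choose_maneuver(scenario, passenger_weight=1.0):
    """Pick the maneuver with the lowest weighted total harm.

    passenger_weight > 1 models a car tuned to protect its own
    occupants; 1.0 weighs everyone on the road equally.
    """
    def cost(harms):
        return sum(
            harm * (passenger_weight if party == "passengers" else 1.0)
            for party, harm in harms.items()
        )
    return min(scenario, key=lambda maneuver: cost(scenario[maneuver]))

# Weighing everyone equally, the car swerves right (spares the child,
# hits the elderly lady); tuned to protect its passengers, it goes
# straight and hits the child instead.
print(choose_maneuver(SCENARIO))                        # swerve_right
print(choose_maneuver(SCENARIO, passenger_weight=5.0))  # straight
```

Note that the hard part isn't the code; it's that someone has to pick those numbers, and whoever picks them owns the outcome.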
The following users liked this post:
losiglow (03-20-2018)
#7
Our infrastructure is, in general, too bad for autonomous cars to be as effective as they should be. We MUST fix our roads and figure out a way to make them foolproof for self-driving cars before they are widely used. That's a tall order.
As a car enthusiast, I can say the only way I'd own an autonomous car is if the above conditions are met, manufacturers perfect the algorithms, the technology is adopted by the masses (meaning fewer humans driving themselves), and I'm old with bad eyes and/or reflexes and shouldn't be driving a vehicle.
The following users liked this post:
RL09 (03-20-2018)
#8
The person/company that owns and operates the autonomous car. When someone gets hurt on an amusement park ride (e.g. roller coaster) due to no fault of their own, isn't the amusement park liable?
I see it as the same thing.
#9
#10
This should probably be in Car Talk or something like that. But anyway, here's my take:
Autonomous car technology isn't the problem. With the big bucks being thrown at it by Google, Uber, and Microsoft, it's only a matter of time (probably a short amount of time) before the technology is viable. What I believe will be the sticking point is liability.
We currently address auto liability through individual drivers carrying auto insurance, which covers the other party or themselves when an accident occurs. Relatively simple. Granted, insurance claims can be a bit complicated, but the system is set up in such a way that it's pretty easy to point the finger at one party or another (that party usually being the insurance company, if you've paid your premiums).
The million-dollar question is: who's liable with an autonomous car? If we're talking about something like Tesla's Autopilot, then it's still the driver. That system is meant only as an "assist" and is clearly described as such. However, with a truly autonomous car that requires NO human input and clearly states as much, it would be unreasonable to hold the owner accountable, and impossible in the case of the car being a taxi or an Uber.
The only one I can see taking on the liability is the manufacturer of the car. Obviously, no manufacturer is going to take full accountability for its cars' actions. In many situations, the car's computer may have to make a split-second decision between two "bads," such as hitting the old lady with the walker or running into a cyclist. That's extreme, but with millions of cars in millions of different situations, there are going to be situations exactly like that. If an auto manufacturer takes on that liability, it'll have a lot of lawsuits on its hands real quick.
Insurance companies could take on the risk. But how would you feel as a consumer if your automated car got into an accident (at no fault of your own) and your rates went up? There would be a world of pissing and moaning over something like that.
Anyway, I'm sure they'll figure something out, but it's going to have to be different from what we're doing now.
My issues are simple. 1. As a car enthusiast, I get into my Audi or RL, even a Murano recently, and look forward to driving. Even after so many years, I still do. Unless my faculties are compromised, I have no use for such a vehicle. 2. I don't think self-driving vehicles need to become the norm. They should be for those who need them because they have no other choice. Driving around self-driving cars isn't safe, in my opinion, and never will be. For a cab company to buy a bunch of them to save on labor is both unsafe and unethical. Etc.
As a society, I don't see how the American one is just chalking this up to "Google and others are spending the money, so it's inevitable." I think this needs to be legislated onto a proper leash.
#11
Which actually sucks because the automotive companies are just going to make us pay more to foot the bill for insurance.
#12
I wonder how they're going to prevent people (e.g., a competing ride-sharing or taxi company) from sabotaging the cars. When an autonomous car is stopped at a red light, (masked) people could run up to it and let the air out of the tires.
#13
#15
And this overwhelming defense of self-driving cars is a true head-scratcher for me, especially on a car-enthusiast website.
I just don't get it.
#16
sounds like you have lots to learn.
it doesn't matter if the website was called enthusiast toasters unite.
Big technology advancements are for the better.
why put up with a shitty toaster when a toaster oven does it better and with more functionality?
make no mistake, toasters, like internal combustion engines, aren't going anywhere.
the new better technology will just blend in with the sea of toasters, until one day...the old toasters will be junked.
sounds like the plot of "The Brave Little Toaster" (1987)
so, yes...eventually, self driving cars will take over....until then, there are plenty of driver's cars out there to enjoy (Aston Martin, Porsche, AMG MBs, M BMWs, and so forth) and they are not going anywhere ANYTIME soon.
and the dashcam footage Uber provided shows the woman stepping out of a shadowy, unprotected crossing. she would have come into contact with the car regardless of who was driving.
#17
You're right, it doesn't matter which board we're on. I simply imagine myself sitting in the driver's seat and not driving the car... more so, that I'm driving and there are other cars around with a computer driving them. I suppose I'm either old-fashioned or paranoid. Or I've watched too many sci-fi movies depicting the coming age where machines take over the world, and the Neo in me kicks in.
Here's what I believe: the human brain cannot be matched by a machine, and the human senses cannot be replicated. We're imposing an environment where only logic makes sense, but in reality the possibilities are endless, and unbelievable outcomes take place all the time. That's due to brain faculties far superior to any machine we've been able to produce which, combined with our connected senses, deliver the most amazing outcomes, ones that logic and computed robotics cannot replicate. With a human driving, that woman may have had a different outcome.
Maybe I've just watched too many American-style defy-the-odds kinda movies.
#18
it's not like this is happening overnight.
there are billions and billions of dollars being thrown at it, with the smartest of the smarts working on it.
you and I will still be able to drive our cars in our lifetime. that's not going away
#19
Many Americans love driving, so yes, I believe we'll continue to have that "right" for a long time. Self-driving cars could certainly result in legislation, though. I wonder if, in 50 years, a special license will be required to drive a car yourself, rather than just riding as a passenger in a self-driving car. Sort of like concealed carry for guns: it was perfectly fine to carry a revolver in your jacket until the 1800s, when states started coming up with CC laws. Now it's illegal to conceal a gun on you unless you have a permit.