06-18-2015, 09:38 AM
#1 | No Titles Given
Join Date: Feb 2005 Location: Vancouver
Posts: 1,570
Thanked 8 Times in 6 Posts
Failed 72 Times in 16 Posts
| Should Your Self-Driving Car Be Programmed To Kill?
It's been like a year since a thread touched on driverless cars. Maybe a touchy topic on this site, as if it were a popularity contest. Might not even want to bring up the subject matter or just close your eyes if you have to. https://www.techdirt.com/articles/20...er-lives.shtml Quote:
Earlier this month Google announced that the company's self-driving cars have had just thirteen accidents since it began testing the technology back in 2009, none the fault of Google. The company has also started releasing monthly reports, which note Google's currently testing 23 Lexus RX450h SUVs on public streets, predominately around the company's hometown of Mountain View, California. According to the company, these vehicles have logged about 1,011,338 "autonomous" (the software is doing the driving) miles since 2009, averaging about 10,000 autonomous miles per week on public streets.
With this announcement about the details of these accidents, Google sent a statement to the news media informing them that while its self-driving cars do get into accidents, the majority appear to involve the cars getting rear-ended at stoplights, through no fault of their own:
"We just got rear-ended again yesterday while stopped at a stoplight in Mountain View. That's two incidents just in the last week where a driver rear-ended us while we were completely stopped at a light! So that brings the tally to 13 minor fender-benders in more than 1.8 million miles of autonomous and manual driving—and still, not once was the self-driving car the cause of the accident."
If you're into this kind of stuff, the reports (pdf) make for some interesting reading, as Google tinkers with and tweaks the software to ensure the vehicles operate as safely as possible. That includes identifying unique situations at the perimeter of traditional traffic rules, like stopping or moving for ambulances despite a green light, or calculating the possible trajectory of two cyclists blotto on Pabst Blue Ribbon and crystal meth. So far, the cars have traveled 1.8 million miles (a combination of manual and automated driving) and have yet to see a truly ugly scenario.
Which is all immeasurably cool. But as Google, Tesla, Volvo and other companies tweak their automated driving software and the application expands, some much harder questions begin to emerge. Like, oh, should your automated car be programmed to kill you if it means saving the lives of a dozen other drivers or pedestrians? That's the quandary researchers at the University of Alabama at Birmingham have been pondering for some time, and it's becoming notably less theoretical as automated car technology quickly advances. The UAB bioethics team treads the ground between futurism and philosophy, and notes that this particular question is rooted in a theoretical scenario known as the Trolley Problem:
"Imagine you are in charge of the switch on a trolley track. The express is due any minute; but as you glance down the line you see a school bus, filled with children, stalled at the level crossing. No problem; that's why you have this switch. But on the alternate track there's more trouble: Your child, who has come to work with you, has fallen down on the rails and can't get up. That switch can save your child or a bus-full of others, but not both. What do you do?"
What would a computer do? What should a Google, Tesla or Volvo automated car be programmed to do when a crash is unavoidable and it needs to calculate all possible trajectories and the safest end scenario? As it stands, Americans take around 250 billion vehicle trips a year, and roughly 30,000 people are killed in traffic accidents annually, something we generally view as an acceptable-but-horrible cost for the convenience. Companies like Google argue that automated cars would dramatically reduce fatality totals, but with a few notable caveats and an obvious loss of control.
When it comes to literally designing and managing the automated car's impact on death totals, UAB researchers argue the choice comes down to utilitarianism (the car automatically calculates and follows through with the option involving the fewest fatalities, potentially at the cost of the driver) and deontology (the car's calculations are in some way tethered to ethics):
"Utilitarianism tells us that we should always do what will produce the greatest happiness for the greatest number of people," he explained. In other words, if it comes down to a choice between sending you into a concrete wall or swerving into the path of an oncoming bus, your car should be programmed to do the former.
Deontology, on the other hand, argues that "some values are simply categorically always true," Barghi continued. "For example, murder is always wrong, and we should never do it." Going back to the trolley problem, "even if shifting the trolley will save five lives, we shouldn't do it because we would be actively killing one," Barghi said. And, despite the odds, a self-driving car shouldn't be programmed to choose to sacrifice its driver to keep others out of harm's way.
Of course, without some notable advancement in AI, the researchers note it's likely impossible to program a computer that can calculate every possible scenario and the myriad of ethical obligations we'd ideally like to apply to them. As such, it seems automated cars will either follow the utilitarian path, or perhaps make no choice at all (just shutting down when confronted with a no-win scenario to avoid additional liability). Google and friends haven't (at least publicly) truly had this debate yet, but it's one that's coming down the road much more quickly than we think.
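For illustration only, here is a rough sketch of what the two policies the article contrasts might look like as code. Everything in it is hypothetical: the option names, the invented probability numbers, and the Option / utilitarian_choice / deontological_choice helpers are made up for the example and don't reflect anything Google or the UAB team has actually published.

```python
# Hypothetical sketch of the two crash policies discussed above.
# All outcomes, probabilities, and names are invented for illustration;
# nothing here reflects any real vehicle's software.
from dataclasses import dataclass


@dataclass
class Option:
    name: str
    expected_occupant_deaths: float  # estimated deaths inside the car
    expected_other_deaths: float     # estimated deaths outside the car
    requires_active_harm: bool       # does the car actively redirect harm at someone?


def total_deaths(option: Option) -> float:
    return option.expected_occupant_deaths + option.expected_other_deaths


def utilitarian_choice(options):
    """Pick whichever option minimizes total expected fatalities,
    even if that means sacrificing the occupant."""
    return min(options, key=total_deaths)


def deontological_choice(options):
    """Never actively redirect harm onto anyone (occupant included);
    among the options that remain, still prefer fewer total deaths."""
    permissible = [o for o in options if not o.requires_active_harm]
    return min(permissible or options, key=total_deaths)


if __name__ == "__main__":
    scenario = [
        Option("stay course into stalled bus", 0.1, 5.0, False),
        Option("swerve into concrete wall",    0.9, 0.0, True),
    ]
    print("utilitarian:  ", utilitarian_choice(scenario).name)    # swerves into the wall
    print("deontological:", deontological_choice(scenario).name)  # stays the course
```

On these toy numbers the two rules disagree, which is exactly the article's point: the hard part isn't writing the code, it's deciding which rule to encode in the first place.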
06-18-2015, 09:58 AM
#2 | To me, there is the Internet and there is RS
Join Date: Apr 2007 Location: Okanagan
Posts: 16,728
Thanked 9,413 Times in 4,098 Posts
Failed 427 Times in 225 Posts
One point I've never seen covered is liability in an at-fault accident, and this line at the end scares me a bit: Quote:
or perhaps make no choice at all (just shutting down when confronted with a no-win scenario to avoid additional liability)
| If a self-driving car simply shuts off automation when it gets confused, the "driver" (operator?) is now in control, but pretty much guaranteed to crash. Does the liability for that crash go to the driver, who wasn't really controlling the thing, or the manufacturer, who is likely to be a bag of dicks to deal with?
And of those 1 million miles, what kind of driving was the car doing? And how well does this stuff work when bugs or snow and ice are covering the sensors? As much as I'm a fan of shitty drivers not driving, allowing them to drive less is just going to make them even shittier for the portion of the time that they do have to drive.
06-18-2015, 10:07 AM
#3 | I WANT MY 10 YEARS BACK FROM RS.net!
Join Date: Mar 2002 Location: ...
Posts: 20,300
Thanked 4,525 Times in 1,357 Posts
Failed 4,505 Times in 971 Posts
Nothing is perfect in this world.
However, ask yourself this question.
If you replace all the Richmond drivers with state-of-the-art automated vehicles, would the accident/death rate increase or decrease?
Who would you trust more: the state-of-the-art technology, or Richmond drivers?
06-18-2015, 10:10 AM
#4 | I WANT MY 10 YEARS BACK FROM RS.net!
Join Date: Mar 2002 Location: ...
Posts: 20,300
Thanked 4,525 Times in 1,357 Posts
Failed 4,505 Times in 971 Posts
As for liability, isn't it kind of the same as the SkyTrain?
If TransLink (the owner) wasn't following the manufacturer's recommended maintenance schedule and that caused an accident, I bet people would be suing TransLink.
If the train speeds up instead of braking due to a manufacturer's defect, I bet the manufacturer will be held responsible for the accident.
06-18-2015, 12:12 PM
#5 | RS has made me the bitter person i am today!
Join Date: Feb 2013 Location: Vancouver
Posts: 4,865
Thanked 7,763 Times in 2,315 Posts
Failed 409 Times in 181 Posts
| Quote:
Originally Posted by Timpo If you replace all the Richmond drivers with state-of-the-art automated vehicles, would the accident/death rate increase or decrease?
Who would you trust more: the state-of-the-art technology, or Richmond drivers? | This has been asked before, even on RS.
This was the answer I think I gave last time:
In the aerospace industry, 95% of all incidents are due to human error. Logically, removing the human from the equation would result in a significant decrease in incidents. HOWEVER, if I asked you to get on a plane that didn't have a pilot, would you get on board?
06-18-2015, 12:24 PM
#6 | To me, there is the Internet and there is RS
Join Date: Apr 2004 Location: Nanaimo
Posts: 16,486
Thanked 7,672 Times in 3,606 Posts
Failed 1,506 Times in 644 Posts
Can't wait till they open drive thru liquor stores.
06-18-2015, 12:38 PM
#7 | I WANT MY 10 YEARS BACK FROM RS.net!
Join Date: Mar 2002 Location: ...
Posts: 20,300
Thanked 4,525 Times in 1,357 Posts
Failed 4,505 Times in 971 Posts
| Quote:
Originally Posted by meme405 This has been asked before, even on RS.
This was the answer I think I gave last time:
In the aerospace industry, 95% of all incidents are due to human error. Logically, removing the human from the equation would result in a significant decrease in incidents. HOWEVER, if I asked you to get on a plane that didn't have a pilot, would you get on board? | This has been talked about too.
They have been working on pilotless commercial airliners for many years. We might be just decades away from flying in pilotless planes - Business Insider http://money.cnn.com/2015/03/27/news...senger-planes/
The only problem they are having is fear of death.
This might make sense for cargo carriers like Purolator and FedEx.
06-18-2015, 01:20 PM
#8 | 2x Variable Nockenwellen Steuerung
Join Date: Oct 2002 Location: N49.2 W122.1
Posts: 6,176
Thanked 1,174 Times in 704 Posts
Failed 67 Times in 51 Posts
Anything can be programmed to kill. However, with robotic cars, I highly doubt an average person, or even most people within ±1 SD of a normal distribution, could figure it out.
Let's not worry about human overpopulation on Mars.
06-18-2015, 02:42 PM
#9 | To me, there is the Internet and there is RS
Join Date: Apr 2007 Location: Okanagan
Posts: 16,728
Thanked 9,413 Times in 4,098 Posts
Failed 427 Times in 225 Posts
| Quote:
Originally Posted by meme405 This has been asked before, even on RS.
This was the answer I think I gave last time:
In the aerospace industry, 95% of all incidents are due to human error. Logically, removing the human from the equation would result in a significant decrease in incidents. HOWEVER, if I asked you to get on a plane that didn't have a pilot, would you get on board? | Important question: What portion of human-error incidents are due to error by the pilot? Because a pilotless aircraft does nothing to fix errors caused by anyone other than pilots. Secondly, how many more non-pilot error incidents would end in fatalities without skilled pilots to correct them? I can think of several major aircraft incidents that would have been much more likely to end fatally, or would have ended fatally more quickly, had a human pilot not been at the controls. Quote:
Originally Posted by Manic! Can't wait till they open drive thru liquor stores. | They already exist, there's been one in Penticton for at least the last 20 years.
06-18-2015, 03:04 PM
#10 | I contribute to threads in the offtopic forum
Join Date: Mar 2004 Location: Vancouver
Posts: 2,799
Thanked 1,831 Times in 587 Posts
Failed 72 Times in 31 Posts
| Quote:
Originally Posted by underscore Important question: What portion of human-error incidents are due to error by the pilot? Because a pilotless aircraft does nothing to fix errors caused by anyone other than pilots. Secondly, how many more non-pilot error incidents would end in fatalities without skilled pilots to correct them? I can think of several major aircraft incidents that would have been much more likely to end fatally, or would have ended fatally more quickly, had a human pilot not been at the controls.
They already exist, there's been one in Penticton for at least the last 20 years. |
There have been many planes that failed mechanically where the pilots prevented them from crashing. Without a pilot, they would have gone down.
Human error, with planes anyway, isn't only on the part of the pilot. The same could be said for the automated car: it requires maintenance to keep running, and if that's done improperly, or at a low-budget shop that cuts corners, you could have a failure resulting in a crash.
06-18-2015, 08:20 PM
#11 | To me, there is the Internet and there is RS
Join Date: Apr 2007 Location: Okanagan
Posts: 16,728
Thanked 9,413 Times in 4,098 Posts
Failed 427 Times in 225 Posts
| Quote:
Originally Posted by BoostedBB6 There have been many planes that failed mechanically where the pilots prevented them from crashing. Without a pilot, they would have gone down.
Human error, with planes anyway, isn't only on the part of the pilot. The same could be said for the automated car: it requires maintenance to keep running, and if that's done improperly, or at a low-budget shop that cuts corners, you could have a failure resulting in a crash. | Even at a high-quality shop, a mistake will occasionally be made. If there is any flaw in the design, maintenance, or manufacturing process, I like my chances a lot better with a flesh-and-blood pilot who is capable of dealing with anomalies that couldn't possibly be programmed for.
06-18-2015, 08:43 PM
#12 | Banned (ABWS)
Join Date: Sep 2013 Location: Vancouver
Posts: 2,452
Thanked 2,667 Times in 960 Posts
Failed 1,536 Times in 385 Posts
What if... what if you could program a Google car to drive like a C-lai, and the computer was actually housed in a fake tissue box in the rear window?
You could sit in the back and take Instagrams of all the unhappy drivers' faces!!!