killamch89 Posted November 19

With self-driving technology advancing rapidly, the idea of fully autonomous vehicles taking over the wheel is becoming a reality. Would you trust one to get you safely from point A to point B? What are your biggest concerns: safety, reliability, or something else? Or are you all in, ready to relax and let the car do the work?
Shagger Posted November 19

The issue with autonomous vehicles is not their ability to drive safely and accurately under normal circumstances. The technology to make that possible exists right now. The problem is how such technology responds to moral dilemmas.

For example, you're in your autonomous car and the brakes fail. If you turn one way, you'll plough head-on into oncoming traffic. Turn the other way and you'll drive off the end of a high overpass to certain death. Straight ahead of you are a group of school children crossing the road. What does the autonomous vehicle do? It sounds reasonable to assume the vehicle would be programmed to protect its occupants, but by that logic it will deliberately plough straight through those children, because that's the scenario the occupants are most likely to survive. Yet the vast majority of human drivers would do whatever it takes to avoid that, because it's the wrong choice morally.

The only option for the makers of autonomous cars would be to program them to make the morally correct choice, which in the above scenario is to drive off the overpass, killing the person who bought the car. That way the children are safe and no other lives are put at risk, unlike with the other two options. But that would mean people who buy autonomous vehicles buy them knowing that, under certain circumstances, the car is programmed to kill them. Would you buy such a car?

The point is, there is no way to design autonomous vehicles that everyone, both inside and outside of them, can fully trust. That is why they can't work.

Edited November 19 by Shagger
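The dilemma Shagger describes boils down to which objective the car's planner is told to optimise. Here is a minimal, purely hypothetical Python sketch of that conflict. The option names, survival probabilities, and casualty estimates are all invented for illustration, and no real autonomous-vehicle stack is being quoted here; the point is only that the two policies from the post pick different actions in the same scenario.

```python
# Hypothetical sketch only - not real AV code. All numbers are made up.
# It contrasts the two policies from the post: "protect the occupants"
# versus "minimise total expected deaths, occupants included".

from dataclasses import dataclass

@dataclass
class Option:
    name: str
    occupant_survival: float      # assumed probability the occupants survive
    expected_other_deaths: float  # assumed expected deaths outside the car

# The three choices from the brake-failure scenario (figures invented).
OPTIONS = [
    Option("swerve into oncoming traffic", occupant_survival=0.2, expected_other_deaths=1.5),
    Option("drive off the overpass",       occupant_survival=0.0, expected_other_deaths=0.0),
    Option("continue straight",            occupant_survival=0.9, expected_other_deaths=4.0),
]

def protect_occupants(options):
    """Policy A: maximise occupant survival, ignoring everyone else."""
    return max(options, key=lambda o: o.occupant_survival)

def minimise_total_harm(options):
    """Policy B: minimise total expected deaths, occupants included."""
    return min(options, key=lambda o: o.expected_other_deaths + (1 - o.occupant_survival))

if __name__ == "__main__":
    print("Policy A chooses:", protect_occupants(OPTIONS).name)    # -> continue straight
    print("Policy B chooses:", minimise_total_harm(OPTIONS).name)  # -> drive off the overpass
```

Run as written, Policy A chooses "continue straight" and Policy B chooses "drive off the overpass", which is exactly the trade-off in the post: whichever objective the manufacturer hard-codes, someone, either the occupants or the bystanders, is deliberately sacrificed.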
killamch89 Posted November 19 (Author)

You make some valid points, Shagger, and I have to agree. I don't trust it either. As a matter of fact, I'd go as far as to say I have no faith in any fully autonomous vehicle to drive me anywhere safely.