Metro Jacksonville

Community => Transportation, Mass Transit & Infrastructure => Topic started by: BridgeTroll on October 26, 2018, 10:40:45 AM

Title: The Morality of Self Driving Cars?
Post by: BridgeTroll on October 26, 2018, 10:40:45 AM
The "Trolley Problem" is a famous ethical dilemma about killing one person to save others. A group of MIT researchers recently applied it to the world of self-driving cars, posing a series of questions to more than 2 million online participants from more than 200 countries. The results reveal some regional preferences, but the overall consensus was clear: In the right situations, animals, the elderly, and small groups of people are in a lot of trouble.

Invented by British philosopher Philippa Foot in 1967, the Trolley Problem uses hypothetical scenarios in extreme circumstances to test utilitarian and Aristotelian ethics. The most common version involves the driver of a trolley, forced to decide between staying on his track and killing five people, or switching tracks and killing only one. For this study, researchers at MIT's Moral Machine created 13 scenarios involving self-driving cars in an urban setting. Although the self-driving industry has debated the issue for years, some say too many years, the new study advances the debate by offering up something that computers can easily understand: big data.

Edmond Awad, a postdoc at MIT Media Lab and lead author of the paper, says that the researchers "found that there are three elements that people seem to approve of the most:"

sparing the lives of humans over the lives of animals;
sparing the lives of many people rather than a few; and
sparing the lives of young people rather than old.
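In programming terms, those three preferences amount to a ranking function over outcomes. A minimal sketch of the idea (the names and weights below are invented for illustration; the actual study measured preferences statistically from pairwise choices rather than hard-coding weights like this):

```python
# Toy scoring heuristic for the three consensus preferences.
# Weights are hypothetical; the Moral Machine paper derived
# preferences from data, it did not program them like this.

def group_score(group):
    """Sum a value for each being; the vehicle spares the higher-scoring group."""
    score = 0.0
    for being in group:
        value = 1.0 if being["species"] == "human" else 0.1  # humans over animals
        if being["species"] == "human" and being["age"] == "young":
            value *= 1.5  # young over old
        score += value  # summing encodes "many over few"
    return score

pedestrians = [{"species": "human", "age": "young"}] * 3
passengers = [{"species": "human", "age": "old"}] * 2
spared = "pedestrians" if group_score(pedestrians) > group_score(passengers) else "passengers"
print(spared)  # three young pedestrians outscore two older passengers
```

Even this toy version makes the controversy concrete: someone has to pick the weights.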

To researchers, the popularity of the Moral Machine shows that people across the globe are eager to participate in the debate around self-driving cars and want to see algorithms that reflect their personal beliefs.

"On the one hand, we wanted to provide a simple way for the public to engage in an important societal discussion," says Iyad Rahwan, an associate professor of media arts and sciences at the Media Lab who worked on the study. "On the other hand, we wanted to collect data to identify which factors people think are important for autonomous cars to use in resolving ethical tradeoffs."

It's a discussion that Awad hopes continues. "What we have tried to do in this project," he says, "and what I would hope becomes more common, is to create public engagement in these sorts of decisions."

How would you program the vehicle?  Give it a try...  8)

http://moralmachine.mit.edu/
Title: Re: The Morality of Self Driving Cars?
Post by: Ocklawaha on October 31, 2018, 03:38:39 PM
As the trolley enters such a 'facing point switch,' as soon as the front trucks have cleared the switch, the points are thrown to the other track. The result is called 'splitting the switch': half a trolley on track one, and half on track two. Derailed and STOPPED!

Then of course there are magnetic brakes, which can clamp the rail so tightly they can hold a cable car on a steep incline.

Okay, yes, a wise a--, but really, Jacksonville, this is more a bus problem than a trolley or Skyway MONORAIL problem...
Title: Re: The Morality of Self Driving Cars?
Post by: bl8jaxnative on November 04, 2018, 02:56:13 PM
The problem with the trolley problem is that it presumes a level of knowledge of a situation that, while common in Hollywood action movies, never exists in real life.
Title: Re: The Morality of Self Driving Cars?
Post by: BridgeTroll on November 04, 2018, 06:15:20 PM
The question is how do you program self driving vehicles of any type for public safety?
Title: Re: The Morality of Self Driving Cars?
Post by: Adam White on November 05, 2018, 04:46:47 AM
Quote from: bl8jaxnative on November 04, 2018, 02:56:13 PM
The problem with the trolley problem is that it presumes a level of knowledge of a situation that, while common in Hollywood action movies, never exists in real life.

That's not a problem, as the "trolley problem" isn't really about transit.
Title: Re: The Morality of Self Driving Cars?
Post by: JeffreyS on November 05, 2018, 06:40:37 AM
Kobayashi Maru.
Title: Re: The Morality of Self Driving Cars?
Post by: BridgeTroll on November 05, 2018, 09:53:58 AM
Quote from: Adam White on November 05, 2018, 04:46:47 AM
Quote from: bl8jaxnative on November 04, 2018, 02:56:13 PM
The problem with the trolley problem is that it presumes a level of knowledge of a situation that, while common in Hollywood action movies, never exists in real life.

That's not a problem, as the "trolley problem" isn't really about transit.
It is about choices and attitudes of those programming the machines.  Just what are the rules?  Are there standards?  Who makes them and who is liable?  If you don't care now... you will in the near future...
Title: Re: The Morality of Self Driving Cars?
Post by: Adam White on November 05, 2018, 09:58:07 AM
Quote from: BridgeTroll on November 05, 2018, 09:53:58 AM
Quote from: Adam White on November 05, 2018, 04:46:47 AM
Quote from: bl8jaxnative on November 04, 2018, 02:56:13 PM
The problem with the trolley problem is that it presumes a level of knowledge of a situation that, while common in Hollywood action movies, never exists in real life.

That's not a problem, as the "trolley problem" isn't really about transit.
It is about choices and attitudes of those programming the machines.  Just what are the rules?  Are there standards?  Who makes them and who is liable?  If you don't care now... you will in the near future...

I understand the trolley problem. It's about ethics and decision-making. And I understand why this is relevant when discussing autonomous cars.

But it's silly to claim it's flawed because the specific scenario posited in the problem never (or rarely) exists in real life. That's not the point.
Title: Re: The Morality of Self Driving Cars?
Post by: Tacachale on November 05, 2018, 12:05:52 PM
Quote from: Adam White on November 05, 2018, 09:58:07 AM
Quote from: BridgeTroll on November 05, 2018, 09:53:58 AM
Quote from: Adam White on November 05, 2018, 04:46:47 AM
Quote from: bl8jaxnative on November 04, 2018, 02:56:13 PM
The problem with the trolley problem is that it presumes a level of knowledge of a situation that, while common in Hollywood action movies, never exists in real life.

That's not a problem, as the "trolley problem" isn't really about transit.
It is about choices and attitudes of those programming the machines.  Just what are the rules?  Are there standards?  Who makes them and who is liable?  If you don't care now... you will in the near future...

I understand the trolley problem. It's about ethics and decision-making. And I understand why this is relevant when discussing autonomous cars.

But it's silly to claim it's flawed because the specific scenario posited in the problem never (or rarely) exists in real life. That's not the point.

The problem with the Monty Hall problem is that it presumes a goat-based game show prize structure which, while common in brain teasers, never existed on the actual Monty Hall show.

The problem with Plato's Allegory of the Cave is that it presumes a situation where people are raised in a cave, which, while common in classical philosophical allegories, never exists in real life.
Title: Re: The Morality of Self Driving Cars?
Post by: BridgeTroll on November 05, 2018, 12:15:53 PM
Hmmm... let's see what MIT says about the purpose of their study...

Quote from: http://moralmachine.mit.edu/
About Moral Machine

From self-driving cars on public roads to self-piloting reusable rockets landing on self-sailing ships, machine intelligence is supporting or entirely taking over ever more complex human activities at an ever increasing pace. The greater autonomy given machine intelligence in these roles can result in situations where they have to make autonomous choices involving human life and limb. This calls for not just a clearer understanding of how humans make such choices, but also a clearer understanding of how humans perceive machine intelligence making such choices.

Recent scientific studies on machine ethics have raised awareness about the topic in the media and public discourse. This website aims to take the discussion further, by providing a platform for 1) building a crowd-sourced picture of human opinion on how machines should make decisions when faced with moral dilemmas, and 2) crowd-sourcing assembly and discussion of potential scenarios of moral consequence.

Didya take the test?  How about a discussion about why you chose as you did...  ::)
Title: Re: The Morality of Self Driving Cars?
Post by: Snufflee on November 05, 2018, 01:33:07 PM
The Trolley Problem is more an exercise in mental gymnastics useful in Psychology/Philosophy 101 than a true tool for moral and ethical questions in programming autonomous algorithms. It isn't moral; it is actually immoral. It asks you to play "God" in choosing between who lives and who dies. It turns an uncontrollable event into a controllable action: who dies in an accident becomes whom do I kill. It is fatalism where every choice is a disaster. The situations it places you in are somewhere between unrealistic and ludicrous, and then it says, hey, make a moral/ethical decision on murder. We are placed as victims of our conditions, facing a binary choice with two horrendous outcomes. The choices do not occur, as human moral choices actually do, as part of a chain of decision-making. Literally everything has been decided for us by an unseen external force, except who will die, which is conveniently left up to us.
Title: Re: The Morality of Self Driving Cars?
Post by: BridgeTroll on November 05, 2018, 01:57:53 PM
Quote from: Snufflee on November 05, 2018, 01:33:07 PM
The Trolley Problem is more an exercise in mental gymnastics useful in Psychology/Philosophy 101 than a true tool for moral and ethical questions in programming autonomous algorithms. It isn't moral; it is actually immoral. It asks you to play "God" in choosing between who lives and who dies. It turns an uncontrollable event into a controllable action: who dies in an accident becomes whom do I kill. It is fatalism where every choice is a disaster. The situations it places you in are somewhere between unrealistic and ludicrous, and then it says, hey, make a moral/ethical decision on murder. We are placed as victims of our conditions, facing a binary choice with two horrendous outcomes. The choices do not occur, as human moral choices actually do, as part of a chain of decision-making. Literally everything has been decided for us by an unseen external force, except who will die, which is conveniently left up to us.

So... how did you score in the MIT study?
Title: Re: The Morality of Self Driving Cars?
Post by: BridgeTroll on November 05, 2018, 02:05:22 PM
I tended to favor those who were in the crosswalk.  I also tended to favor those who were legally in the crosswalk or had the green light.  I went with the "pedestrians always have the right of way" argument to justify my choices...  8)
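That "right of way" reasoning can be written down as a lexicographic rule: legality outranks head count. A toy sketch, with invented field names (no real AV planner works on tidy symbolic inputs like this):

```python
# Lexicographic tiebreaker: spare lawful road users first, then larger groups.
# Field names and groups are hypothetical, purely for illustration.

def group_to_spare(groups):
    """Rank groups by (crossing legally, size); spare the best-ranked one.

    Tuples compare element by element, so legality always wins before size.
    """
    return max(groups, key=lambda g: (g["legal_crossing"], g["size"]))

groups = [
    {"name": "jaywalkers", "legal_crossing": False, "size": 3},
    {"name": "crosswalk peds", "legal_crossing": True, "size": 2},
]
print(group_to_spare(groups)["name"])  # prints "crosswalk peds"
```

Note the design choice this encodes: because legality sorts first, a smaller lawful group beats a larger unlawful one, which matches the "pedestrians always have the right of way" justification.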
Title: Re: The Morality of Self Driving Cars?
Post by: Adam White on November 05, 2018, 02:10:23 PM
Quote from: Tacachale on November 05, 2018, 12:05:52 PM
Quote from: Adam White on November 05, 2018, 09:58:07 AM
Quote from: BridgeTroll on November 05, 2018, 09:53:58 AM
Quote from: Adam White on November 05, 2018, 04:46:47 AM
Quote from: bl8jaxnative on November 04, 2018, 02:56:13 PM
The problem with the trolley problem is that it presumes a level of knowledge of a situation that, while common in Hollywood action movies, never exists in real life.

That's not a problem, as the "trolley problem" isn't really about transit.
It is about choices and attitudes of those programming the machines.  Just what are the rules?  Are there standards?  Who makes them and who is liable?  If you don't care now... you will in the near future...

I understand the trolley problem. It's about ethics and decision-making. And I understand why this is relevant when discussing autonomous cars.

But it's silly to claim it's flawed because the specific scenario posited in the problem never (or rarely) exists in real life. That's not the point.

The problem with the Monty Hall problem is that it presumes a goat-based game show prize structure which, while common in brain teasers, never existed on the actual Monty Hall show.

The problem with Plato's Allegory of the Cave is that it presumes a situation where people are raised in a cave, which, while common in classical philosophical allegories, never exists in real life.

LOL!
Title: Re: The Morality of Self Driving Cars?
Post by: Snufflee on November 05, 2018, 02:37:32 PM
Quote from: BridgeTroll on November 05, 2018, 01:57:53 PM
Quote from: Snufflee on November 05, 2018, 01:33:07 PM
The Trolley Problem is more an exercise in mental gymnastics useful in Psychology/Philosophy 101 than a true tool for moral and ethical questions in programming autonomous algorithms. It isn't moral; it is actually immoral. It asks you to play "God" in choosing between who lives and who dies. It turns an uncontrollable event into a controllable action: who dies in an accident becomes whom do I kill. It is fatalism where every choice is a disaster. The situations it places you in are somewhere between unrealistic and ludicrous, and then it says, hey, make a moral/ethical decision on murder. We are placed as victims of our conditions, facing a binary choice with two horrendous outcomes. The choices do not occur, as human moral choices actually do, as part of a chain of decision-making. Literally everything has been decided for us by an unseen external force, except who will die, which is conveniently left up to us.

So... how did you score in the MIT study?

I chose to run over the healthy adults and save the animals.
Title: Re: The Morality of Self Driving Cars?
Post by: BridgeTroll on November 05, 2018, 03:05:42 PM
Quote from: Snufflee on November 05, 2018, 02:37:32 PM
Quote from: BridgeTroll on November 05, 2018, 01:57:53 PM
Quote from: Snufflee on November 05, 2018, 01:33:07 PM
The Trolley Problem is more an exercise in mental gymnastics useful in Psychology/Philosophy 101 than a true tool for moral and ethical questions in programming autonomous algorithms. It isn't moral; it is actually immoral. It asks you to play "God" in choosing between who lives and who dies. It turns an uncontrollable event into a controllable action: who dies in an accident becomes whom do I kill. It is fatalism where every choice is a disaster. The situations it places you in are somewhere between unrealistic and ludicrous, and then it says, hey, make a moral/ethical decision on murder. We are placed as victims of our conditions, facing a binary choice with two horrendous outcomes. The choices do not occur, as human moral choices actually do, as part of a chain of decision-making. Literally everything has been decided for us by an unseen external force, except who will die, which is conveniently left up to us.

So... how did you score in the MIT study?

I chose to run over the healthy adults and save the animals.
Interesting...
Title: Re: The Morality of Self Driving Cars?
Post by: Tacachale on November 05, 2018, 03:59:21 PM
Quote from: BridgeTroll on November 05, 2018, 02:05:22 PM
I tended to favor those who were in the crosswalk.  I also tended to favor those who were legally in the crosswalk or had the green light.  I went with the "pedestrians always have the right of way" argument to justify my choices...  8)

I favored people in the crosswalk, not changing lanes, and those who were walking legally. I also went with saving people over animals. My justification was similar to yours - the people in the car have chosen that method of transit with its risks, so the car shouldn't make a change to kill pedestrians to save passengers. It got stickier for me when the choice was between which pedestrians to kill and which to save.
Title: Re: The Morality of Self Driving Cars?
Post by: Non-RedNeck Westsider on November 05, 2018, 09:14:30 PM
Quote from: Tacachale on November 05, 2018, 03:59:21 PM
Quote from: BridgeTroll on November 05, 2018, 02:05:22 PM
I tended to favor those who were in the crosswalk.  I also tended to favor those who were legally in the crosswalk or had the green light.  I went with the "pedestrians always have the right of way" argument to justify my choices...  8)

I favored people in the crosswalk, not changing lanes, and those who were walking legally. I also went with saving people over animals. My justification was similar to yours - the people in the car have chosen that method of transit with its risks, so the car shouldn't make a change to kill pedestrians to save passengers. It got stickier for me when the choice was between which pedestrians to kill and which to save.

Why is that stickier?  Are we basing this on the expectation that the 'car' will know the ages, social status and employment of the pedestrians?  No.

IMO, you can't take that into consideration and have to have the car continue on a straight path.
Title: Re: The Morality of Self Driving Cars?
Post by: Tacachale on November 06, 2018, 11:11:47 AM
Quote from: Non-RedNeck Westsider on November 05, 2018, 09:14:30 PM
Quote from: Tacachale on November 05, 2018, 03:59:21 PM
Quote from: BridgeTroll on November 05, 2018, 02:05:22 PM
I tended to favor those who were in the crosswalk.  I also tended to favor those who were legally in the crosswalk or had the green light.  I went with the "pedestrians always have the right of way" argument to justify my choices...  8)

I favored people in the crosswalk, not changing lanes, and those who were walking legally. I also went with saving people over animals. My justification was similar to yours - the people in the car have chosen that method of transit with its risks, so the car shouldn't make a change to kill pedestrians to save passengers. It got stickier for me when the choice was between which pedestrians to kill and which to save.

Why is that stickier?  Are we basing this on the expectation that the 'car' will know the ages, social status and employment of the pedestrians?  No.

IMO, you can't take that into consideration and have to have the car continue on a straight path.

It got stickier when it comes to changing lanes and killing fewer people vs. staying in the same lane and killing more people. I don't think the ages, social status, and employment of the pedestrians are important.
Title: Re: The Morality of Self Driving Cars?
Post by: BridgeTroll on November 06, 2018, 11:21:01 AM
Quote from: Tacachale on November 06, 2018, 11:11:47 AM
Quote from: Non-RedNeck Westsider on November 05, 2018, 09:14:30 PM
Quote from: Tacachale on November 05, 2018, 03:59:21 PM
Quote from: BridgeTroll on November 05, 2018, 02:05:22 PM
I tended to favor those who were in the crosswalk.  I also tended to favor those who were legally in the crosswalk or had the green light.  I went with the "pedestrians always have the right of way" argument to justify my choices...  8)

I favored people in the crosswalk, not changing lanes, and those who were walking legally. I also went with saving people over animals. My justification was similar to yours - the people in the car have chosen that method of transit with its risks, so the car shouldn't make a change to kill pedestrians to save passengers. It got stickier for me when the choice was between which pedestrians to kill and which to save.

Why is that stickier?  Are we basing this on the expectation that the 'car' will know the ages, social status and employment of the pedestrians?  No.

IMO, you can't take that into consideration and have to have the car continue on a straight path.

It got stickier when it comes to changing lanes and killing fewer people vs. staying in the same lane and killing more people. I don't think the ages, social status, and employment of the pedestrians are important.

I think NRW's point is that the car or trolley will not know, so we should not consider it... I think MIT is looking for our biases and how those biases would affect the safety programming algorithms.  Who is responsible for the choice an automated machine makes?
Title: Re: The Morality of Self Driving Cars?
Post by: thelakelander on November 06, 2018, 11:30:01 AM
MIT should extend the parameters of their study. How about slowing the design speed of the road down and reducing the posted speed limit, meaning the car can stop before it gets to the crosswalk? How about crosswalks with RRFBs (rectangular rapid flashing beacons), or physical/channelized separation or grade separation between motorized and non-motorized modes of travel? Or simply realizing that a day when there will be 100% automation is a pipe dream right now. We can barely afford to repave streets right now. Who's going to fund the required infrastructure upgrades for a 100% AV world on the roads? Who's going to die on the sword politically to ban manual driving? Yes, some people like the act of being in control behind the wheel....
Title: Re: The Morality of Self Driving Cars?
Post by: BridgeTroll on November 06, 2018, 12:39:32 PM
Quote from: thelakelander on November 06, 2018, 11:30:01 AM
MIT should extend the parameters of their study. How about slowing the design speed of the road down and reducing the posted speed limit, meaning the car can stop before it gets to the crosswalk? How about crosswalks with RRFBs (rectangular rapid flashing beacons), or physical/channelized separation or grade separation between motorized and non-motorized modes of travel? Or simply realizing that a day when there will be 100% automation is a pipe dream right now. We can barely afford to repave streets right now. Who's going to fund the required infrastructure upgrades for a 100% AV world on the roads? Who's going to die on the sword politically to ban manual driving? Yes, some people like the act of being in control behind the wheel....

Great questions, and I think many of us know some of the answers already...  The sword question is one that I expect may happen piece by piece, city by city, with a city in California starting the process...
Title: Re: The Morality of Self Driving Cars?
Post by: Non-RedNeck Westsider on November 06, 2018, 03:31:58 PM
Quote from: BridgeTroll on November 06, 2018, 11:21:01 AM
I think NRW's point is that the car or trolley will not know, so we should not consider it... I think MIT is looking for our biases and how those biases would affect the safety programming algorithms.  Who is responsible for the choice an automated machine makes?

Pretty much.

And if the car/trolley with failing brakes continues on rather than self-sacrificing, then the potential for more conflicts is infinite.

And I truly believe that for any system to truly work, it has to be all or nothing when/if it's implemented.  Similar to the 'Lexus lanes'...  You can opt to take a path that is 100% automated or you can take a path that is 100% manual.  I think when it's mixed, we would never get a true result of how effective it is or isn't.

I do know that my car has several features that use this technology, but only barely.  The self-adjusting cruise control is a nice feature as long as you're not in any kind of real traffic.  The automatic lane adjust sucks and I turn it off every time I drive.  I haven't fully tested out the automatic braking, so I can't really report whether it works or not.

If anyone wants to volunteer to test it out, you can stand in the path of the car as I set the cruise at 30 and take my hands off the wheel and the pedals. I'll be happy to meet you somewhere, but I have some paperwork for you to sign first.   ;D
Title: Re: The Morality of Self Driving Cars?
Post by: thelakelander on November 06, 2018, 04:27:24 PM
Quote from: BridgeTroll on November 06, 2018, 12:39:32 PM
Great questions, and I think many of us know some of the answers already...  The sword question is one that I expect may happen piece by piece, city by city, with a city in California starting the process...

That would be interesting, from a social equity standpoint alone, since California is one of the more liberal political environments in the country. From a technical and transportation infrastructure standpoint, I predict growth and implementation of CVs (connected vehicles) is way more likely and realistic than 100% AVs being forced on society anytime soon.
Title: Re: The Morality of Self Driving Cars?
Post by: Non-RedNeck Westsider on November 06, 2018, 06:20:16 PM
Quote from: Tacachale on November 06, 2018, 11:11:47 AM
Quote from: Non-RedNeck Westsider on November 05, 2018, 09:14:30 PM
Quote from: Tacachale on November 05, 2018, 03:59:21 PM
Quote from: BridgeTroll on November 05, 2018, 02:05:22 PM
I tended to favor those who were in the crosswalk.  I also tended to favor those who were legally in the crosswalk or had the green light.  I went with the "pedestrians always have the right of way" argument to justify my choices...  8)

I favored people in the crosswalk, not changing lanes, and those who were walking legally. I also went with saving people over animals. My justification was similar to yours - the people in the car have chosen that method of transit with its risks, so the car shouldn't make a change to kill pedestrians to save passengers. It got stickier for me when the choice was between which pedestrians to kill and which to save.

Why is that stickier?  Are we basing this on the expectation that the 'car' will know the ages, social status and employment of the pedestrians?  No.

IMO, you can't take that into consideration and have to have the car continue on a straight path.

It got stickier when it comes to changing lanes and killing fewer people vs. staying in the same lane and killing more people. I don't think the ages, social status, and employment of the pedestrians are important.

I obviously misunderstood your comment before.  I think that the car swerving to avoid quantity allows for a certain unpredictability.  I understand that in the model, all peds are assumed killed.  In a real-world situation that may or may not be the case.

In a completely macabre truth, it may be better for a malfunctioning car to aim for the larger crowds first, in hopes of disabling the vehicle as quickly as possible and potentially preventing any further deaths at future conflict zones.
Title: Re: The Morality of Self Driving Cars?
Post by: BridgeTroll on November 21, 2018, 10:35:53 AM
Algorithms are opinion...  8)

https://www.youtube.com/v/heQzqX35c9A

Title: Re: The Morality of Self Driving Cars?
Post by: bl8jaxnative on December 16, 2018, 11:53:22 AM
Quote from: Snufflee on November 05, 2018, 01:33:07 PM
The Trolley Problem is more an exercise in mental gymnastics useful in Psychology/Philosophy 101 than a true tool for moral and ethical questions in programming autonomous algorithms. It isn't moral; it is actually immoral. It asks you to play "God" in choosing between who lives and who dies. It turns an uncontrollable event into a controllable action: who dies in an accident becomes whom do I kill. It is fatalism where every choice is a disaster. The situations it places you in are somewhere between unrealistic and ludicrous, and then it says, hey, make a moral/ethical decision on murder. We are placed as victims of our conditions, facing a binary choice with two horrendous outcomes. The choices do not occur, as human moral choices actually do, as part of a chain of decision-making. Literally everything has been decided for us by an unseen external force, except who will die, which is conveniently left up to us.

Thank you.  And this, among other reasons, is why there is nothing moral or immoral about robocars.  They just are.
Title: Re: The Morality of Self Driving Cars?
Post by: BridgeTroll on December 16, 2018, 04:16:24 PM
And thank you for completely missing the point... ::)