The Social Dilemma of Self-Driving Cars

Started by BridgeTroll, June 29, 2016, 08:13:37 AM

BridgeTroll

The ethics of self-driving cars... AI... and robots is an interesting subject.  Here are a couple of good videos to help explore it.

https://www.youtube.com/v/nBkQQ6czRJI

Here is one from TED...

https://www.youtube.com/v/ixIoDYVfKA0

In a boat at sea one of the men began to bore a hole in the bottom of the boat. On being remonstrated with, he answered, "I am only boring under my own seat." "Yes," said his companions, "but when the sea rushes in we shall all be drowned with you."

ProjectMaximus

Thanks for sharing BT! I never considered this.

BridgeTroll

It seems keeping within the lanes and obeying traffic signals and laws is the easy part!  8)

Bridges

The trolley problem is such a red herring.  In the current state of automobile transportation, people are not equipped to even make this decision.  The reaction and awareness of self-driving cars is beyond any speed we can comprehend.  Constant and instant awareness.  The number of times a self-driving car will even be involved in having to knowingly choose between pedestrians and occupants will be virtually nil.

Our airplanes are autonomous.  To a slightly lesser extent, escalators and elevators are autonomous.

The true trolley problem is the one we face today, in this very debate.  Do we continue down the same path, with thousands of deaths a year and countless amounts of money in property destruction, or do we swerve towards safety on the off chance that we have one scenario a year where an autonomous car has to face this dilemma?
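
To put rough numbers on that tradeoff, here is a back-of-envelope sketch. The figures are illustrative assumptions, not measurements, apart from the 94 percent human-error share that comes up later in this thread.

```python
# Back-of-envelope comparison of the two "tracks" of the real trolley problem.
# All numbers below are illustrative assumptions, not measured data.

HUMAN_DRIVEN_DEATHS_PER_YEAR = 35_000  # rough order of magnitude for annual U.S. road deaths
HUMAN_ERROR_SHARE = 0.94               # share of crashes attributed to human error (cited later in the thread)
DILEMMA_SCENARIOS_PER_YEAR = 1         # assumed frequency of a true "choose who dies" situation

# If automation eliminated only the human-error share of fatalities:
deaths_avoidable = HUMAN_DRIVEN_DEATHS_PER_YEAR * HUMAN_ERROR_SHARE

print(f"Deaths per year attributable to human error: ~{deaths_avoidable:,.0f}")
print(f"Knowing-choice dilemmas per year (assumed):   {DILEMMA_SCENARIOS_PER_YEAR}")
print(f"Ratio: roughly {deaths_avoidable / DILEMMA_SCENARIOS_PER_YEAR:,.0f} to 1")
```

Even with generous assumptions about how often the dilemma actually comes up, the comparison is lopsided.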

One day in the future our kids or grandkids will ask us about cars:
Kid: "Did you use to drive your own car?"
Me: "Yes, everyone did."
Kid: "Wasn't that dangerous?"
Me: "Oh, very.  Every day we would turn on the TV to learn where the accidents were, then we would get in the car and the radio would update us on where other new accidents were.  People died every day.  It was terrible."
So I said to him: Arthur, Artie, come on, why does the salesman have to die? Change the title: The Life of a Salesman. That's what people want to see.

BridgeTroll

You still have to program them... no matter how often you think they will encounter such a situation.  You say people are not equipped to make these decisions... yet they have to make them every day.
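
Even if the branch is almost never taken, somebody still has to write it. A deliberately bare sketch (hypothetical names and weights; no claim that any vendor does it this way):

```python
# Hypothetical illustration of the point above: however rare the situation,
# the unavoidable-collision branch still has to exist, and writing it forces
# an explicit choice of priorities.

def plan_emergency_maneuver(options):
    """Pick among maneuvers when a collision can no longer be avoided.

    `options` is a list of dicts like {"maneuver": ..., "occupant_risk": ...,
    "pedestrian_risk": ...}; the weighting below is the ethical decision the
    programmer cannot avoid making.
    """
    OCCUPANT_WEIGHT = 1.0      # <- someone has to choose these numbers
    PEDESTRIAN_WEIGHT = 1.0

    def expected_harm(option):
        return (OCCUPANT_WEIGHT * option["occupant_risk"]
                + PEDESTRIAN_WEIGHT * option["pedestrian_risk"])

    return min(options, key=expected_harm)

# Example: swerve (risking occupants) vs. brake straight (risking a pedestrian).
print(plan_emergency_maneuver([
    {"maneuver": "swerve", "occupant_risk": 0.3, "pedestrian_risk": 0.0},
    {"maneuver": "brake_straight", "occupant_risk": 0.0, "pedestrian_risk": 0.4},
]))
```

The point is not the particular weights; leaving them at 1.0, or setting them to anything else, is itself the ethical decision.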

Aircraft are certainly NOT autonomous.

I do not think either of those videos found the problems insurmountable... just that they are ethical problems that require more thought than you are apparently engaging in...

JeffreyS

Lenny Smash

Sonic101

Currently, in cars that have a semi-autonomous mode (Tesla, BMW, Mercedes, Cadillac, Infiniti, Volvo) where the car drives for you on freeways (conditions permitting), the car will come to a halt if something were to happen and you don't take over. Moving up to the whole idea of the car making life-or-death decisions is a big leap from the current technology; I would imagine this debuting in production around or after 2025.

Side note: Cadillac calls this function 'Super Cruise', which I think is befitting, and I will now always call it that.

finehoe

Tesla driver's death using car's 'Autopilot' probed by NHTSA

WASHINGTON — A driver so enamored of his Tesla Model S sedan that he nicknamed the car "Tessy" and praised the safety benefits of its sophisticated "Autopilot" system has become the first U.S. fatality in a wreck involving a car in self-driving mode.

The National Highway Traffic Safety Administration announced the driver's death Thursday, and said it is investigating the design and performance of the Autopilot system.

Joshua D. Brown of Canton, Ohio, the 40-year-old owner of a technology company, was killed May 7 in Williston, Florida, when his car's cameras failed to distinguish the white side of a turning tractor-trailer from a brightly lit sky and didn't automatically activate its brakes, according to statements by the government and the automaker. Just one month earlier, Brown had credited the Autopilot system for preventing a collision on an interstate.

Frank Baressi, 62, the driver of the truck and owner of Okemah Express LLC, said the Tesla driver was "playing Harry Potter on the TV screen" at the time of the crash and driving so quickly that "he went so fast through my trailer I didn't see him."

The movie "was still playing when he died and snapped a telephone pole a quarter mile down the road," Baressi told The Associated Press in an interview from his home in Palm Harbor, Florida. He acknowledged he didn't see the movie, only heard it.

Tesla Motors Inc. said it is not possible to watch videos on the Model S touch screen. There was no reference to the movie in initial police reports.

Brown's published obituary described him as a member of the Navy SEALs for 11 years and founder of Nexu Innovations Inc., working on wireless internet networks and camera systems. In Washington, the Pentagon confirmed Brown's work with the SEALs and said he left the service in 2008.

Brown was an enthusiastic booster of his 2015 Tesla Model S and in April praised its sophisticated Autopilot system for avoiding a crash when a commercial truck swerved into his lane on an interstate. He published a video of the incident online. "Hands down the best car I have ever owned and use it to its full extent," Brown wrote.

Tesla didn't identify Brown but described him in a statement as "a friend to Tesla and the broader EV (electric vehicle) community, a person who spent his life focused on innovation and the promise of technology and who believed strongly in Tesla's mission." It also stressed the uncertainty about its new system, noting that drivers must manually enable it: "Autopilot is getting better all the time, but it is not perfect and still requires the driver to remain alert."

A man answering the door at Brown's parents' house who did not identify himself said he had no comment.

Tesla founder Elon Musk expressed "our condolences for the tragic loss" in a tweet late Thursday.

Preliminary reports indicate the crash occurred when Baressi's rig turned left in front of Brown's Tesla at an intersection of a divided highway southwest of Gainesville, Florida, where there was no traffic light, NHTSA said. Brown died at the scene.

By the time firefighters arrived, the wreckage of the Tesla — with its roof sheared off completely — had come to rest in a nearby yard hundreds of feet from the crash site, assistant chief Danny Wallace of the Williston Fire Department told the AP.

Tesla said in a statement that this was the first known death in over 130 million miles of Autopilot operation. Before Autopilot can be used, drivers have to acknowledge that the system is an "assist feature" that requires a driver to keep both hands on the wheel at all times. Drivers are told they need to "maintain control and responsibility for your vehicle" while using the system, and they have to be prepared to take over at any time, the statement said.

Autopilot makes frequent checks to ensure the driver's hands are on the wheel; it gives visual and audible alerts if hands aren't detected and gradually slows the car until the driver responds, the statement said.
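
As a rough illustration of the escalation the statement describes, and only that (a hypothetical sketch, not Tesla's code), the logic amounts to: check for hands, warn, then gradually slow down.

```python
import time

# Hypothetical sketch of the escalation behavior described above: check for
# hands on the wheel, alert, then gradually slow the car until the driver
# responds. Intervals and thresholds are assumptions, not Tesla's values.

CHECK_INTERVAL_S = 5           # assumed check cadence
ALERTS_BEFORE_SLOWDOWN = 3     # assumed number of escalating alerts
SPEED_STEP_MPH = 5             # assumed speed reduction per cycle

def supervise(hands_detected, show_alert, play_chime, set_target_speed, speed):
    """Run hands-on-wheel checks; return the (possibly reduced) target speed."""
    missed = 0
    while not hands_detected():
        missed += 1
        show_alert("Hold steering wheel")            # visual alert
        if missed > 1:
            play_chime()                             # audible alert on repeat misses
        if missed > ALERTS_BEFORE_SLOWDOWN:
            speed = max(0, speed - SPEED_STEP_MPH)   # gradually slow until driver responds
            set_target_speed(speed)
        time.sleep(CHECK_INTERVAL_S)
    return speed
```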

The Autopilot mode allows the Model S sedan and Model X SUV to steer itself within a lane, change lanes and speed up or slow down based on surrounding traffic or the driver's set speed. It can automatically apply brakes and slow the vehicle. It can also scan for parking spaces and parallel park on command.

NHTSA said the opening of the preliminary evaluation by its defects investigation office shouldn't be construed as a finding that the government believes the Model S is defective.

Brown's death comes as NHTSA is taking steps to ease the way onto the nation's roads for self-driving cars, an anticipated sea-change in driving. Self-driving cars have been expected to be a boon to safety because they'll eliminate human errors. Human error is responsible for about 94 percent of crashes.

One of Tesla's advantages over competitors is that its thousands of cars feed real-world performance information back to the company, which can then fine-tune the software that runs Autopilot.

This is not the first time automatic braking systems have malfunctioned, and several have been recalled to fix problems. Last fall, Ford recalled 37,000 F-150 pickups because they braked with nothing in the way. The company said the radar could become confused when passing a large, reflective truck.

The technology relies on multiple cameras, radar, laser and computers to sense objects and determine if they are in the car's way, said Mike Harley, an analyst at Kelley Blue Book. Systems like Tesla's, which rely heavily on cameras, "aren't sophisticated enough to overcome blindness from bright or low contrast light," he said.

Harley called the death unfortunate, but said that more deaths can be expected as the autonomous technology is refined.

https://www.washingtonpost.com/politics/tesla-driver-killed-in-crash-while-using-cars-autopilot/2016/06/30/29d740b6-3f22-11e6-9e16-4cf01a41decb_story.html

BridgeTroll

I suppose it is fortunate he only killed himself...

AviationMetalSmith

If the car is programmed to obey all laws, including the speed limit, then will the car be moving slower than other traffic?

What if every car is tracked by GPS? What if a system based on the railroads' Centralized Traffic Control (CTC) is enacted?

What if all vehicles had to travel single-file?

Will the programmer be responsible for vehicular homicide?

The programmers will have to think of something.

Maybe motorcyclists will be enclosed in a plastic pod in the future?
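
The CTC question above maps onto software fairly directly. A purely hypothetical sketch, borrowing the railroad notion of exclusive block occupancy:

```python
# Hypothetical sketch of a railroad-style Centralized Traffic Control (CTC)
# scheme applied to GPS-tracked cars: the road is divided into blocks, and a
# central dispatcher grants each vehicle exclusive occupancy of a block.

class Dispatcher:
    def __init__(self, block_ids):
        self.occupancy = {block: None for block in block_ids}  # block -> vehicle or None

    def request_block(self, vehicle_id, block):
        """Grant the block if it is free; otherwise the vehicle must hold back."""
        if self.occupancy[block] is None:
            self.occupancy[block] = vehicle_id
            return True
        return False

    def release_block(self, vehicle_id, block):
        if self.occupancy[block] == vehicle_id:
            self.occupancy[block] = None

# Usage: a car asks for the next block before entering it, just as a train
# waits for the next signal block to clear.
dispatcher = Dispatcher(["I-90:mile-12", "I-90:mile-13"])
if dispatcher.request_block("car-42", "I-90:mile-12"):
    pass  # proceed into the block; otherwise slow and wait
```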

BridgeTroll

http://www.scientificamerican.com/article/what-nasa-could-teach-tesla-about-autopilot-s-limits/

What NASA Could Teach Tesla about Autopilot's Limits
Decades of research have warned about the human attention span in automated cockpits
By John Pavlus on July 18, 2016

Tesla Motors says the Autopilot system for its Model S sedan "relieves drivers of the most tedious and potentially dangerous aspects of road travel." The second part of that promise was put in doubt by the fatal crash of a Model S earlier this year, when its Autopilot system failed to recognize a tractor-trailer turning in front of the vehicle. Tesla says the driver, Joshua Brown, also failed to notice the trailer in time to prevent a collision. The result? In Tesla's own words, "the brake was not applied"—and the car plowed under the trailer at full speed, killing Brown.

Since news of Brown's death broke in June, the public has been debating where the fault lies: with the driver, the company or the automation technology itself. But NASA has been studying the psychological effects of automation in cockpits for decades—and this body of research suggests that a combination of all three factors may be responsible. "If you think about the functionality of a cockpit, that could mean in an airplane, a space shuttle or a car," says Danette Allen, director of NASA Langley Research Center's Autonomy Incubator. "NASA, perhaps more than any other organization, has been thinking about autonomy and automation for a long time."
Stephen Casner, a research psychologist in NASA's Human Systems Integration Division, puts it more bluntly: "News flash: Cars in 2017 equal airplanes in 1983."

Casner is not just referring to basic mechanisms that keep the nose of the plane level, similar to cruise control in a car. He means, in his words, "the full package": true autonomous flight, from just after takeoff up to (and even including) landing. "The first Madonna album had not come out yet when we had this technology," Casner says. "And we are, 33 years later, having this very same conversation about cars."
Here are three things about how humans and automated vehicles behave together that NASA has known for years—and to which Tesla may need to pay more attention.

THE LIMITS OF BEING "ON THE LOOP"

People often use the phrase "in the loop" to describe how connected someone is (or is not) to a decision-making process. Fewer people know that this "control loop" has a specific name: Observe, Orient, Decide, Act (OODA). The framework was originally devised by a U.S. Air Force colonel, and being "in" and "out" of the OODA loop have straightforward meanings. But as automation becomes more prevalent in everyday life, an understanding of how humans behave in an in-between state—known as "on the loop"—will become more important.
Missy Cummings, a former Navy fighter pilot and director of Duke University's Humans and Autonomy Laboratory, defines "on the loop" as human supervisory control: "intermittent human operator interaction with a remote, automated system in order to manage a controlled process or task environment." Air traffic controllers, for example, are on the loop of the commercial planes flying in their airspace. And thanks to increasingly sophisticated cockpit automation, most of the pilots are, too.
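
As a minimal sketch of what that supervisory arrangement looks like in code terms (names and objects here are invented for illustration, not taken from any real system): the automation runs the OODA loop continuously while the human merely monitors and occasionally takes over.

```python
# Hypothetical sketch of the OODA structure with a human "on the loop":
# the automation runs the loop continuously; the human only monitors and
# intervenes intermittently.

def estimate_state(observation):
    """Orient: placeholder for fusing sensor data into a world model."""
    return observation

def choose_action(state):
    """Decide: placeholder for picking a steering/speed command."""
    return {"steer": 0.0, "throttle": 0.1}

def ooda_step(sensors, actuators):
    observation = sensors.read()          # Observe
    state = estimate_state(observation)   # Orient
    command = choose_action(state)        # Decide
    actuators.apply(command)              # Act
    return state

def supervised_drive(sensors, actuators, human):
    """Human supervisory control: intermittent interaction with the automation."""
    while True:
        state = ooda_step(sensors, actuators)
        if human.wants_control(state):    # the supervisor steps back "in the loop"
            human.take_over(actuators)
            break
```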

Tesla compares Autopilot with this kind of on-the-loop aviation, saying it "functions like the systems that airplane pilots use when conditions are clear." But there's a problem with that comparison, Casner says: "An airplane is eight miles high in the sky." If anything goes wrong, a pilot usually has multiple minutes—not to mention emergency checklists, precharted hazards and the help of the crew—in which to transition back in the loop of control. (For more on this, see Steven Shladover's article, "What 'Self-Driving' Cars Will Really Look Like," from the June 2016 Scientific American.)

Automobile drivers, for obvious reasons, often have much less time to react. "When something pops up in front of your car, you have one second," Casner says. "You think of a Top Gun pilot needing to have lightning-fast reflexes? Well, an ordinary driver needs to be even faster."
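
That one-second figure is easy to make concrete. A quick calculation with assumed speeds (not from the article):

```python
# How far a vehicle travels during the time available to re-enter the loop.
# The speeds and windows below are assumptions for illustration.

highway_speed_ms = 65 * 0.44704          # 65 mph in metres per second (~29 m/s)
car_reaction_window_s = 1                # "you have one second"
airliner_reaction_window_s = 3 * 60      # "multiple minutes" -- call it three

print(f"Car covers ~{highway_speed_ms * car_reaction_window_s:.0f} m before a decision is due.")
print(f"A pilot at cruise has ~{airliner_reaction_window_s} s to run checklists and re-engage.")
```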

In other words, the everyday driving environment affords so little margin for error that any distinction between "on" and "in" the loop can quickly become moot. Tesla acknowledges this by constraining the circumstances in which a driver can engage Autopilot: "clear lane lines, a relatively constant speed, a sense of the cars around you and a map of the area you're traveling through," according to MIT Technology Review. But Brown's death suggests that, even within this seemingly conservative envelope, driving "on the loop" may be uniquely unforgiving.

THE LIMITS OF ATTENTION

Of course, ordinary human negligence can turn even the safest automation deadly. That's why Tesla says that Autopilot "makes frequent checks to ensure that the driver's hands remain on the wheel and provides visual and audible alerts if hands-on is not detected."
But NASA has been down this road before, too. In studies of highly automated cockpits, NASA researchers documented a peculiar psychological pattern: The more foolproof the automation's performance becomes, the harder it is for an on-the-loop supervisor to monitor it. "What we heard from pilots is that they had trouble following along [with the automation]," Casner says. "If you're sitting there watching the system and it's doing great, it's very tiring." In fact, it's extremely difficult for humans to accurately monitor a repetitive process for long periods of time. This so-called "vigilance decrement" was first identified and measured in 1948 by psychologist Norman Mackworth, who asked British radar operators to spend two hours watching for errors in the sweep of a rigged analog clock. Mackworth found that the radar operators' accuracy plummeted after 30 minutes; more recent versions of the experiment have documented similar vigilance decrements after just 15 minutes.
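
For readers unfamiliar with the Mackworth setup, the rigged clock is simple to express as a toy sketch (the parameters below are illustrative, not Mackworth's actual values): a pointer ticks steadily, and the observer must catch the rare double-length jumps.

```python
import random

# Toy sketch of a Mackworth-clock-style vigilance task: a pointer ticks once a
# second, and on rare, unpredictable ticks it jumps twice as far. The observer's
# job is to catch those double jumps. Parameters are illustrative only.

def clock_ticks(duration_s=2 * 60 * 60, signal_probability=0.005, seed=0):
    """Yield (time, jump_size); jump_size 2 marks the rare target event."""
    rng = random.Random(seed)
    for t in range(duration_s):
        yield t, 2 if rng.random() < signal_probability else 1

targets = [t for t, jump in clock_ticks() if jump == 2]
print(f"{len(targets)} target events scattered across two hours of watching.")
```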

These findings expose a contradiction in systems like Tesla's Autopilot. The better they work, the more they may encourage us to zone out—but in order to ensure their safe operation they require continuous attention. Even if Joshua Brown was not watching Harry Potter behind the wheel, his own psychology may still have conspired against him.
According to some researchers, this potentially dangerous contradiction is baked into the demand for self-driving cars themselves. "No one is going to buy a partially-automated car [like Tesla's Model S] just so they can monitor the automation," says Edwin Hutchins, a MacArthur Fellow and cognitive scientist who recently co-authored a paper on self-driving cars with Casner and design expert Donald Norman. "People are already eating, applying makeup, talking on the phone and fiddling with the entertainment system when they should be paying attention to the road," Hutchins explains. "They're going to buy [self-driving cars] so that they can do more of that stuff, not less."

AUTOMATION AND AUTONOMY: NOT THE SAME THING

Tesla's approach to developing self-driving cars relies on an assumption that incremental advances in automation will one day culminate in "fully driverless cars." The National Highway Traffic Safety Administration (NHTSA) tacitly endorses this assumption in its four-level classification scheme for vehicle automation: Level 1 refers to "invisible" driver assistance like antilock brakes with electronic stability control. Level 2 applies to cars that combine two or more level 1 systems; a common example is adaptive cruise control combined with lane centering. Level 3 covers "Limited Self-Driving Automation" in cars like the Model S, where "the driver is expected to be available for occasional control but with sufficiently comfortable transition time." Level 3, warns Hutchins, "is where the problems are going to be"—but not because partial automation is inherently unsafe. Instead, he says, the danger lies in assuming that "Full Self-Driving Automation"—level 4 on NHTSA's scale—is a logical extension of level 3. "The NHTSA automation levels encourage people to think these are steps on the same path," Hutchins explains. "I think [level 3 automation] is actually going in a somewhat different direction."
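
Paraphrasing the article's descriptions, the four levels reduce to a simple lookup (a sketch of the scheme as summarized here, not NHTSA's official wording):

```python
# NHTSA automation levels as summarized in the article above.
NHTSA_LEVELS = {
    1: "Invisible driver assistance, e.g. antilock brakes with electronic stability control",
    2: "Two or more level-1 systems combined, e.g. adaptive cruise control plus lane centering",
    3: "Limited self-driving; driver must be available with a comfortable transition time",
    4: "Full self-driving automation",
}

for level, description in sorted(NHTSA_LEVELS.items()):
    print(f"Level {level}: {description}")
```
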
Technology disruptors like Google and traditional carmakers like Ford and Volvo seem to agree. Both groups appear determined to sidestep level 3 automation entirely, because of its potential for inviting "mode confusion" in ambiguous situations. Mode confusion was made tragically famous by the Air France 447 disaster, in which pilots were unaware that the plane's fly-by-wire safety system had disengaged itself.

Given the state of research into automated vehicle operation—and the ongoing NHTSA investigation of Brown's crash—it is premature to fault either Tesla or Brown individually. And although any automated system that can log more than 200 million kilometers of driving without a fatality—as Autopilot has—is an amazing achievement, level 3 automation may simply possess properties that make it unsuitable for cars, even as it functions reliably in aviation and other contexts. But whereas understanding the psychological pitfalls around automation cannot bring Brown back, one hopes it might help prevent more deaths like his as self-driving cars continue to evolve.