QuoteTEMPE, Ariz. — A self-driving Uber vehicle struck and killed a pedestrian in a Phoenix suburb in the first fatality involving a fully autonomous test vehicle, prompting the ride-hailing company Monday to suspend all road-testing of such autos in the U.S. and Canada.
Depending on who is found to be at fault, the accident could have far-reaching consequences for the development of self-driving vehicles, which have been billed as potentially safer than human drivers.
The Volvo was in self-driving mode with a human operator behind the wheel when a woman walking outside a crosswalk in Tempe on Sunday night was hit, police said. The woman, identified as Elaine Herzberg, 49, died at a hospital.
Uber suspended all of its self-driving vehicle testing in the Phoenix area, Pittsburgh, San Francisco and Toronto.
Full article: http://www.jacksonville.com/business/20180319/woman-struck-and-killed-by-self-driving-uber-vehicle-in-arizona
Just reading the preliminary details, I'm not sure this woman would be alive even if the vehicle had been human operated. Sounds like she pretty much pushed a bicycle across a busy street and directly into the path of a vehicle going 40 mph.
TERRIBLE story, it won't be the last like it, but ultimately, this tech is going to save a LOT of lives and make transportation a lot more convenient, accessible, and affordable for all.
Perhaps. My major question is how many decades before we can realistically expect many of the benefits associated with having roads where all vehicles are 100% driverless? There are a lot of bugs to work out and overcome. Some, like properly operating in the rain and snow, are pretty basic but challenging at this point. Once those bugs are out, then there's the transition period of integrating with human driven vehicles, the political football that will go along with that and the issue of properly addressing social equality.
This story will generate tons of (digital) ink, I have no doubt, with all of the doomsayers saying "See? We told you!" Yet, how many dozens of people have been run over by people playing with their phones while driving, and they shake their heads and say "What a shame. Oh, well, one can't mourn forever. Let's move on."
No doomsayers. Just some realistic understanding that there's a lot of testing and many challenges still to be overcome before we get anywhere close to some of the guaranteed benefits being tossed around. The reality is we'll probably end up somewhere in the middle, and if and when that happens, how do we really modify our infrastructure to best meet the demands of the future? For example, if pedestrian safety is important, further separation of motorized and non-motorized modes of travel may still be the best way to address that specific issue.
Well the police say it was the pedestrian's fault for darting across, in the middle of the street, in the dark, instead of using the lit crosswalk. The cameras on the vehicle corroborate this, and the backup human driver did not see her at all until after impact.
Of course...given the autocentric design of our infrastructure, it's literally always the pedestrian's fault. However, in the industry we know streets should be better balanced between modes. Not knowing the context, perhaps this is an area that should have lower speeds and more mid-block crossings? This could be a situation where an AV remains or becomes a worse option for pedestrians than a human-driven vehicle, because the focus on what really improves safety is being misplaced on vehicle technology instead of common sense.
Quote from: ProjectMaximus on March 20, 2018, 09:21:06 AM
Well the police say it was the pedestrian's fault for darting across, in the middle of the street, in the dark, instead of using the lit crosswalk. The cameras on the vehicle corroborate this, and the backup human driver did not see her at all until after impact.
Also, what's the chance that a backup driver pays less attention to the environment around them because of more reliance on the technology to get it right? Lots of grey areas and interpretations that will ultimately have to be decided in court (this won't be the last and only accident).
Quote from: thelakelander on March 20, 2018, 08:50:01 AM
No doomsayers. Just some realistic understanding that there's a lot of testing and many challenges still to be overcome before we get anywhere close to some of the guaranteed benefits being tossed around. The reality is we'll probably end up somewhere in the middle, and if and when that happens, how do we really modify our infrastructure to best meet the demands of the future? For example, if pedestrian safety is important, further separation of motorized and non-motorized modes of travel may still be the best way to address that specific issue.
Don't get me wrong. I'm a pedestrian, runner, and cyclist who is in the roadway frequently. I am responsible for keeping an eye on vehicles and placing myself in situations where I can recognize danger well before it gets to me. Crossing the street in the dark with traffic present (while the traffic has the ROW) is NONE of that. If she had been hit by ANYTHING else other than an autonomous vehicle, this would have barely made the local news out there.
My point is, people will point at the vehicle as the cause of this crash, not the fact that she did just about everything wrong that a pedestrian could do.
My general point is that testing and challenges still exist, and the promised safety gains aren't anywhere close to being achieved and will never be realized through a focus on autonomous vehicles alone. The technology may be cool, but it's still too early to consider it perfect, without fault, or the end-all solution for improving multimodal safety and connectivity.
The few questions below are examples of important issues an AV will never resolve:
Why is the street not properly lit? Does a car need to travel at 45 mph through a corridor where pedestrians are present (if the incident had taken place at 25 mph, would the pedestrian's chances of survival have increased)? How far apart are crosswalks spaced, and how safe are they? Are there pedestrian generators located mid-block that discourage the use of the crosswalk and encourage crossing elsewhere? Should there be mid-block crossings and better signage to improve hostile conflict points? Is there a way to improve the road itself to reduce the number of conflict points?
Also, another part of me wouldn't take police reports at face value. In recent months, we've seen locally how biases in reporting can play out.
Quote from: thelakelander on March 20, 2018, 10:45:26 AM
My general point is that testing and challenges still exist, and the promised safety gains aren't anywhere close to being achieved and will never be realized through a focus on autonomous vehicles alone. The technology may be cool, but it's still too early to consider it perfect, without fault, or the end-all solution for improving multimodal safety and connectivity.
The few questions below are examples of important issues an AV will never resolve:
Why is the street not properly lit? Does a car need to travel at 45 mph through a corridor where pedestrians are present (if the incident had taken place at 25 mph, would the pedestrian's chances of survival have increased)? How far apart are crosswalks spaced, and how safe are they? Are there pedestrian generators located mid-block that discourage the use of the crosswalk and encourage crossing elsewhere? Should there be mid-block crossings and better signage to improve hostile conflict points? Is there a way to improve the road itself to reduce the number of conflict points?
Also, another part of me wouldn't take police reports at face value. In recent months, we've seen locally how biases in reporting can play out.
I believe we are on the same page, actually. Despite having a car that has much of this technology in it, I'm not ready to turn the roads over to completely autonomous vehicles for a very long time. The infrastructure simply is not there to support it. If the roads and traffic control devices were all built for it, that would be another story, but they're not.
I am absolutely appalled at the new TV ads being put out by Cadillac, showing drivers going down the highway while they pay attention to everything EXCEPT the road, implying that their car will be completely safe in hands-free (attention-free) mode. The only way this will be true (again) is if nearly ALL cars are doing the same thing AND they are interacting with the infrastructure at a much higher level than we see today.
Looking to the future, I'm most concerned with the road maintenance that AVs will require. How are we going to fund the extra maintenance to keep lane markings and signs well lit and clear when we can barely fund the roads the way they are now? I realize that we could program the cars to better understand what to do when little information is available, but are we willing to rely on that more than having clear road markings and signs?
Quote from: TimmyB on March 20, 2018, 11:13:07 AM
Quote from: thelakelander on March 20, 2018, 10:45:26 AM
My general point is that testing and challenges still exist, and the promised safety gains aren't anywhere close to being achieved and will never be realized through a focus on autonomous vehicles alone. The technology may be cool, but it's still too early to consider it perfect, without fault, or the end-all solution for improving multimodal safety and connectivity.
The few questions below are examples of important issues an AV will never resolve:
Why is the street not properly lit? Does a car need to travel at 45 mph through a corridor where pedestrians are present (if the incident had taken place at 25 mph, would the pedestrian's chances of survival have increased)? How far apart are crosswalks spaced, and how safe are they? Are there pedestrian generators located mid-block that discourage the use of the crosswalk and encourage crossing elsewhere? Should there be mid-block crossings and better signage to improve hostile conflict points? Is there a way to improve the road itself to reduce the number of conflict points?
Also, another part of me wouldn't take police reports at face value. In recent months, we've seen locally how biases in reporting can play out.
I believe we are on the same page, actually. Despite having a car that has much of this technology in it, I'm not ready to turn the roads over to completely autonomous vehicles for a very long time. The infrastructure simply is not there to support it. If the roads and traffic control devices were all built for it, that would be another story, but they're not.
I am absolutely appalled at the new TV ads being put out by Cadillac, showing drivers going down the highway while they pay attention to everything EXCEPT the road, implying that their car will be completely safe in hands-free (attention-free) mode. The only way this will be true (again) is if nearly ALL cars are doing the same thing AND they are interacting with the infrastructure at a much higher level than we see today.
Yes, we are on the same page.
Quote from: TimmyB on March 20, 2018, 11:13:07 AM
Quote from: thelakelander on March 20, 2018, 10:45:26 AM
My general point is that testing and challenges still exist, and the promised safety gains aren't anywhere close to being achieved and will never be realized through a focus on autonomous vehicles alone. The technology may be cool, but it's still too early to consider it perfect, without fault, or the end-all solution for improving multimodal safety and connectivity.
The few questions below are examples of important issues an AV will never resolve:
Why is the street not properly lit? Does a car need to travel at 45 mph through a corridor where pedestrians are present (if the incident had taken place at 25 mph, would the pedestrian's chances of survival have increased)? How far apart are crosswalks spaced, and how safe are they? Are there pedestrian generators located mid-block that discourage the use of the crosswalk and encourage crossing elsewhere? Should there be mid-block crossings and better signage to improve hostile conflict points? Is there a way to improve the road itself to reduce the number of conflict points?
Also, another part of me wouldn't take police reports at face value. In recent months, we've seen locally how biases in reporting can play out.
I believe we are on the same page, actually. Despite having a car that has much of this technology in it, I'm not ready to turn the roads over to completely autonomous vehicles for a very long time. The infrastructure simply is not there to support it. If the roads and traffic control devices were all built for it, that would be another story, but they're not.
I am absolutely appalled at the new TV ads being put out by Cadillac, showing drivers going down the highway while they pay attention to everything EXCEPT the road, implying that their car will be completely safe in hands-free (attention-free) mode. The only way this will be true (again) is if nearly ALL cars are doing the same thing AND they are interacting with the infrastructure at a much higher level than we see today.
The one where the guy crosses his arms or the one where the guy drinks from a soda bottle? I saw that as being better than what Tesla did giving theirs a bad name and telling everyone: "Good luck, it's in beta btw." The Cadillac system will also literally not let you take your eyes off the road or it turns off, which seems annoying but that's probably the point.
Quote from: Sonic101 on March 20, 2018, 11:34:24 AM
Quote from: TimmyB on March 20, 2018, 11:13:07 AM
Quote from: thelakelander on March 20, 2018, 10:45:26 AM
My general point is that testing and challenges still exist, and the promised safety gains aren't anywhere close to being achieved and will never be realized through a focus on autonomous vehicles alone. The technology may be cool, but it's still too early to consider it perfect, without fault, or the end-all solution for improving multimodal safety and connectivity.
The few questions below are examples of important issues an AV will never resolve:
Why is the street not properly lit? Does a car need to travel at 45 mph through a corridor where pedestrians are present (if the incident had taken place at 25 mph, would the pedestrian's chances of survival have increased)? How far apart are crosswalks spaced, and how safe are they? Are there pedestrian generators located mid-block that discourage the use of the crosswalk and encourage crossing elsewhere? Should there be mid-block crossings and better signage to improve hostile conflict points? Is there a way to improve the road itself to reduce the number of conflict points?
Also, another part of me wouldn't take police reports at face value. In recent months, we've seen locally how biases in reporting can play out.
I believe we are on the same page, actually. Despite having a car that has much of this technology in it, I'm not ready to turn the roads over to completely autonomous vehicles for a very long time. The infrastructure simply is not there to support it. If the roads and traffic control devices were all built for it, that would be another story, but they're not.
I am absolutely appalled at the new TV ads being put out by Cadillac, showing drivers going down the highway while they pay attention to everything EXCEPT the road, implying that their car will be completely safe in hands-free (attention-free) mode. The only way this will be true (again) is if nearly ALL cars are doing the same thing AND they are interacting with the infrastructure at a much higher level than we see today.
The one where the guy crosses his arms or the one where the guy drinks from a soda bottle? I saw that as being better than what Tesla did giving theirs a bad name and telling everyone: "Good luck, it's in beta btw." The Cadillac system will also literally not let you take your eyes off the road or it turns off, which seems annoying but that's probably the point.
I've never seen the Tesla commercial, but living in Michigan (where Teslas barely exist) until last October would probably account for the lack of advertising impact.
Quote from: KenFSU on March 19, 2018, 09:44:29 PM
Just reading the preliminary details, I'm not sure this woman would be alive even if the vehicle had been human operated. Sounds like she pretty much pushed a bicycle across a busy street and directly into the path of a vehicle going 40 mph.
TERRIBLE story, it won't be the last like it, but ultimately, this tech is going to save a LOT of lives and make transportation a lot more convenient, accessible, and affordable for all.
Developed by the same people who have given (actually sold) us beta software with tons of bugs for decades now? That's a scary thought...
The problem with the technology is that, at best, it can't tell a human from a sign painted to look like a human. It has no ability to anticipate human movements. Heat-sensing radar is being adapted, but that is going to add more than $50k per vehicle to the price, and it is incapable of telling a dog from a human. With five independent lidar systems at $35k each, plus the needed improvements, these things are not ready to mix it up with crazy traffic just yet. So far there is one dead driver in Florida because the technology couldn't tell that the sun's reflection off a semi didn't equal an open road. Another flipped on its side in a not-at-fault collision, with bumps and bruises.
Low speed, exclusive downtown, curbside transit lanes? Maybe...
Mixed traffic? Not yet.
Skyway? Insanity!
QuoteDon't Let the First Pedestrian Death by Uber's Self-Driving Car Freak You Out
A pedestrian in Arizona was killed by a self-driving Uber.
Tracey Lindeman
Mar 20 2018, 9:45am
Nearly 6,000 pedestrians were hit and killed by cars in the US last year—an increase of nine percent from the previous year, and the highest number since 1990, according to the National Highway Traffic Safety Administration. Pedestrians aren't alone; drivers, motorcyclists, and cyclists are all being killed with greater frequency on American roads. In 2016, 37,461 people died in traffic accidents in the US.
https://motherboard.vice.com/en_us/article/mbxjyv/autonomous-uber-kills-pedestrian-in-tempe-arizona-self-driving-car
According to the local investigators, it wouldn't have made a difference who or what was driving that car. The woman crossed in the dark, suddenly moving out into the street in front of the vehicle (at a place where there were signs stating NOT to cross the road). The car was traveling 40 mph, although the speed limit was 35.
What this suggests is that the claims of increased safety can't really be realized in an environment where human unpredictability exists. The local investigation can blame the pedestrian all it wants, but the reality is that we design streets for vehicles, not people. The built environment is a significant factor in multimodal safety, and it can't be ignored or overcome via technology alone.
Unfortunately, the natural reaction is for naysayers to suggest this is why AVs don't work, and for backers wearing rose-colored glasses to wrap themselves around the idea that the pedestrian is at fault. What remains true is that many of the benefits AVs are being promoted to bring simply can't be delivered in an environment where there's still human interaction.
In addition, none of these links regarding this incident even point to or attempt to resolve the real issues that improve safety between motorized and non-motorized modes: high vehicular speed (drop that posted speed limit down to 25 mph and the severity of a crash drops significantly... heck, maybe at a lower speed the vehicle even has time to stop), autocentric context and infrastructure design, lack of crosswalk locations designed for natural human movement, poor lighting and visibility, etc. These things aren't as sexy as the idea of self-driving vehicles, but they are the things that actually improve safety, encourage economic development, improve multimodal connectivity, and enhance a community's quality of life. They make up the backbone of a true "smart city".
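To put a rough number on the speed point above, here's a back-of-the-envelope sketch with assumed reaction time, braking rate, and sight distance (none of it from the actual Tempe investigation):

from math import sqrt

MPH_TO_MPS = 0.44704

def impact_speed_mph(travel_mph, sight_distance_m, reaction_s=1.5, decel_mps2=6.0):
    """Speed still carried at the pedestrian's position after reacting and braking."""
    v0 = travel_mph * MPH_TO_MPS
    braking_dist = sight_distance_m - v0 * reaction_s  # distance left once the brakes bite
    if braking_dist <= 0:
        return travel_mph  # hit before the brakes even engage
    v_sq = v0 ** 2 - 2 * decel_mps2 * braking_dist
    return 0.0 if v_sq <= 0 else sqrt(v_sq) / MPH_TO_MPS

if __name__ == "__main__":
    for mph in (25, 35, 45):
        print(f"{mph} mph travel speed -> ~{impact_speed_mph(mph, 40.0):.0f} mph at impact "
              f"(pedestrian first visible 40 m ahead)")

With those made-up numbers, the 25 mph car stops short of the pedestrian while the 45 mph car is still doing close to 40 mph at impact, which is the whole argument for matching posted speeds to pedestrian activity.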
Here you go Lake... 8)
QuoteYet beyond the personal tragedy, the events in Phoenix do nothing to change the broader context that existed the day before the collision happened: American roads are deeply unsafe, transformative technology is coming whether we like it or not, and the public must now enter a complex but essential debate about how we should integrate AV technology within the places where we live, work, and play.
https://www.brookings.edu/blog/the-avenue/2018/03/20/what-ubers-autonomous-vehicle-fatality-tells-us-about-the-future-of-place/?utm_source=feedblitz&utm_medium=FeedBlitzRss&utm_campaign=brookingsrss/topfeeds/latestfrombrookings
Quote from: thelakelander on March 21, 2018, 09:15:57 AM
What this suggests is that the claims of increased safety can't really be realized in an environment where human unpredictability exists. The local investigation can blame the pedestrian all it wants, but the reality is that we design streets for vehicles, not people. The built environment is a significant factor in multimodal safety, and it can't be ignored or overcome via technology alone.
Unfortunately, the natural reaction is for naysayers to suggest this is why AVs don't work, and for backers wearing rose-colored glasses to wrap themselves around the idea that the pedestrian is at fault. What remains true is that many of the benefits AVs are being promoted to bring simply can't be delivered in an environment where there's still human interaction.
In addition, none of these links regarding this incident even point to or attempt to resolve the real issues that improve safety between motorized and non-motorized modes: high vehicular speed (drop that posted speed limit down to 25 mph and the severity of a crash drops significantly... heck, maybe at a lower speed the vehicle even has time to stop), autocentric context and infrastructure design, lack of crosswalk locations designed for natural human movement, poor lighting and visibility, etc. These things aren't as sexy as the idea of self-driving vehicles, but they are the things that actually improve safety, encourage economic development, improve multimodal connectivity, and enhance a community's quality of life. They make up the backbone of a true "smart city".
Excellent post.
Excellent excellent point, Lake.
Video is out of the moments up to the collision:
https://twitter.com/TempePolice/status/976585098542833664
Honestly, it looks like the pedestrian was in the wrong here. Crossing in an area with no street lights, no crosswalk, in darkness with no reflective material or artificial light, and apparently paying no attention to the approaching headlights. Nobody ordered this person to cross here, or ignore the vehicle moving towards them.
While I understand Lake's thoughts on improving the city to be safer, I don't see how it isn't the pedestrian's responsibility to be mindful of the environment around them and its changes, because we're not at the point where the changes to improve pedestrian safety have been made yet. And especially at night, while illegally crossing, with no safety equipment.
Of course, Uber will need to carefully consider their next move, especially in the technology they choose in trying to make self-driving an economical reality. This is as much a PR learning experience as it is a technological learning experience, and they better learn a lot, because people won't tolerate continuous high-profile accidents involving fatalities.
Wow, take a look at the context. There's a sidewalk that dead-ends into the street mid-block, and there are two sidewalks in the median (although signs have been added after the fact telling people to use the crosswalk). There's also a bus stop mid-block, which is where it appears the pedestrian possibly came from. Then the lighting is pretty poor. The context definitely creates a situation where you'll have pedestrians crossing at that point. We can blame the pedestrian, but publicly we set up a mix of pedestrian generators that will naturally cause a pedestrian to cross mid-block rather than travel 500 feet out of the way to cross at a crosswalk spanning seven lanes.
(https://static01.nyt.com/newsgraphics/2018/03/20/self-driving-uber-death/2c7f6d822fd34bb257293061ffccfdbcf5d74517/accident-diagram-1050.png)
Also, I hadn't realized she had already crossed four 12-foot travel and turn lanes. I did suspect the driver wasn't paying complete attention because of an over-reliance on the self-driving technology. Now looking at it, I don't think any color of clothing would have helped. Better lighting would have made her easier to see, considering she didn't just teleport herself across four lanes before being hit. Also, if they don't want pedestrians crossing at that location, it would have made better sense to not install a paved sidewalk in the median, to not place the bus stop in that particular location, and instead to make the median inaccessible through pedestrian channelization (given the location of the bus stop/sidewalk relative to the Marquee Theatre).
QuoteThe video shows that the safety driver, identified by police as Rafael Vasquez, was clearly distracted and looking down from the road.
It also appears that both of the safety driver's hands were not hovering above the steering wheel, which is what most backup drivers are instructed to do because it allows them to take control of the car quickly in the case of an emergency.
Earlier in the week, police officials said the driver was not impaired and had cooperated with authorities. The self-driving car, however, should have detected the woman crossing the road.
Like many self-driving cars, Uber equips its vehicles with lidar sensors -- an acronym for light detection and ranging systems -- to help the car detect the world around it. One of the positive attributes of Lidar is that it is supposed to work well at night when it is dark, detecting objects from hundreds of feet away.
The accident was a reminder that self-driving technology is still in the experimental stage, as Silicon Valley giants, major automakers and other companies race to develop vehicles that can drive on their own.
Full article: https://www.nytimes.com/interactive/2018/03/20/us/self-driving-uber-pedestrian-killed.html
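A quick sanity check on that detection-range claim, using purely illustrative numbers (nothing here comes from the article or the police report):

MPH_TO_MPS = 0.44704

def stopping_distance_m(speed_mph, reaction_s=1.5, decel_mps2=6.0):
    """Reaction distance plus braking distance at a given travel speed."""
    v = speed_mph * MPH_TO_MPS
    return v * reaction_s + v ** 2 / (2 * decel_mps2)

def time_to_collision_s(detection_range_m, speed_mph):
    """Seconds until impact if neither party changes speed or course."""
    return detection_range_m / (speed_mph * MPH_TO_MPS)

if __name__ == "__main__":
    detection_range_m = 60.0  # about 200 ft, a conservative slice of "hundreds of feet"
    for mph in (25, 40):
        print(f"{mph} mph: ~{stopping_distance_m(mph):.0f} m to stop, "
              f"~{time_to_collision_s(detection_range_m, mph):.1f} s of warning "
              f"for an object first seen at {detection_range_m:.0f} m")

With those assumptions, the car needs roughly 55 m to stop from 40 mph and gets about 3.4 seconds of warning from a 60 m detection, so the physics left plenty of margin; the apparent failure to detect and brake is the striking part.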
Quote from: marcuscnelson on March 22, 2018, 01:08:53 AM
Video is out of the moments up to the collision:
https://twitter.com/TempePolice/status/976585098542833664
Honestly, it looks like the pedestrian was in the wrong here. Crossing in an area with no street lights, no crosswalk, in darkness with no reflective material or artificial light, and apparently paying no attention to the approaching headlights. Nobody ordered this person to cross here, or ignore the vehicle moving towards them.
While I understand Lake's thoughts on improving the city to be safer, I don't see how it isn't the pedestrian's responsibility to be mindful of the environment around them and its changes, because we're not at the point where the changes to improve pedestrian safety have been made yet. And especially at night, while illegally crossing, with no safety equipment.
Of course, Uber will need to carefully consider their next move, especially in the technology they choose in trying to make self-driving an economical reality. This is as much a PR learning experience as it is a technological learning experience, and they better learn a lot, because people won't tolerate continuous high-profile accidents involving fatalities.
Jaywalking should not be a death sentence. Obviously the LIDAR system on the car failed to detect the pedestrian for some reason, whether it be a software failure or some form of mechanical failure. The vehicle should have easily detected the obstruction and at least started to brake. There appears to be no evidence of the car braking at all. If the vehicle cannot do this basic task, then it should not be allowed on the road, period. The safety driver was sloppy and should be held responsible as well; that's what he/she is getting paid for, not to do what I suspect, which is looking at a phone. At least we know now that she didn't dart in front of the vehicle.
Quote from: civil42806 on March 22, 2018, 09:35:20 AM
Quote from: marcuscnelson on March 22, 2018, 01:08:53 AM
Video is out of the moments up to the collision:
https://twitter.com/TempePolice/status/976585098542833664
Honestly, it looks like the pedestrian was in the wrong here. Crossing in an area with no street lights, no crosswalk, in darkness with no reflective material or artificial light, and apparently paying no attention to the approaching headlights. Nobody ordered this person to cross here, or ignore the vehicle moving towards them.
While I understand Lake's thoughts on improving the city to be safer, I don't see how it isn't the pedestrian's responsibility to be mindful of the environment around them and its changes, because we're not at the point where the changes to improve pedestrian safety have been made yet. And especially at night, while illegally crossing, with no safety equipment.
Of course, Uber will need to carefully consider their next move, especially in the technology they choose in trying to make self-driving an economical reality. This is as much a PR learning experience as it is a technological learning experience, and they better learn a lot, because people won't tolerate continuous high-profile accidents involving fatalities.
Jaywalking should not be a death sentence. Obviously the LIDAR system on the car failed to detect the pedestrian for some reason, whether it be a software failure or some form of mechanical failure. The vehicle should have easily detected the obstruction and at least started to brake. There appears to be no evidence of the car braking at all. If the vehicle cannot do this basic task, then it should not be allowed on the road, period. The safety driver was sloppy and should be held responsible as well; that's what he/she is getting paid for, not to do what I suspect, which is looking at a phone. At least we know now that she didn't dart in front of the vehicle.
Jaywalking shouldn't be a death sentence, but it's going to be because that's how physics works. I don't disagree on the likelihood of there having been a system failure, but the conditions so highly decreased the chance of detection that it's not very surprising that this collision ended up happening. I believe it was also mentioned elsewhere that LIDAR is poor at detecting objects moving perpendicular to the sensors.
So yes, the technology failed, but at the same time the conditions that the pedestrian chose to undertake were ultimately conducive to that failure. Alas, regulations are always written in blood. It's real-world data like this that unfortunately tends to be needed for making improvements to both technology and legal approaches.
These cars are about to be the new elevator. Back in the early 20th century, most people did not like automated elevators and preferred to have an attendant operating the crank. Although the cranks were much more dangerous and there were accidents regularly, when one automated elevator had issues it made national news and caused a lot of buildings to go back to the old crank with an attendant for a while.
Here is a good summary of the situation in my opinion.
http://www.thetruthaboutcars.com/2018/03/uber-tempe-video/#more-1617988
So someone was struck by a vehicle over in Englewood earlier tonight.
https://www.news4jax.com/news/pedestrian-struck-by-vehicle-in-englewood-area-police-say
Let's see if any comparison comes up between this accident and what happened in Arizona.
Nationally, the Jax incident won't make the news, so there's no comparison worth making. Locally, we've already filled these forums with ideas about how we can improve streets like Edgewood for multimodal safety and economic revitalization opportunities.
If there is enough light, to great distance, provided by the headlights of an automobile, an alert, sober, and capable individual provides a superb driving ability, with the result that it is almost impossible for the vehicle to hit anything that appears in front of the vehicle -- unless of course an object practically drops out of the sky onto the roadway.
Statistics might confirm that the very fact that all drivers are not always alert, sober, and skilled to the extreme ensures continued crashes, injuries, and deaths.
In my view, the sensory, perception, computation, and reaction abilities of the alert and sober human far exceed the current level of self-drive technology. At least several more years of development are needed before the self-drive technology can be relied upon to even approach the abilities of the skilled, alert, and sober individual.
The current self-drive vehicles function on a level of a rather stupid, somewhat drugged driver. Before reliable, effective, and safe self-drive vehicles arrive, the self-drive systems must be improved to such a degree that the eventual types will barely be recognized when compared to the current rather dumb systems.
Sensory systems must be able to perceive instantaneously anything that exists or moves ... must be able to quickly assess the essence of the relationship between a perceived object and its environment ... with proper decision as to action.
Currently, the self-drive systems seem to some observers to be more effective than is the case, simply because they compare to the many drivers on the road who crash, injure, and kill as a consequence of not being alert ... of being drugged via alcohol or drugs, or of simply being stupid. We must compare any self-drive system to the competent individual in the first paragraph.
There are many individuals in our population who are able to get a driver's licence but are continually dangerous on the road because they are mentally slow, or at least perform poorly when attempting to spatially engage the environment. These are individuals who we certainly would not want driving the cab we use, or flying the airliner in which we fly.
Certainly, the current self-drive autos might approach being as safe as the current population of near idiots on the road ... especially if these drivers are texting or drinking. This is not good enough.
Most of us will be impressed and satisfied with the self drive autos only when they have approached the abilities of the skilled, alert, and sober drivers ... those who never have had, or caused, crashes ... those who mostly prevent crashes that would have happened via one of the mentally slow drivers ... the near idiots who attempt to injure and kill every day.
The human system of perception, analysis, reaction, and action is much more complicated ... much more profound as to performance, than many realize. The current state of robotics or AI is quite effective and impressive when engaging scenarios of somewhat fixed and repetitive tasks, but when the variables and possibilities of events increase to extremes, such as when an automobile is maneuvered through the many variables encountered on a typical journey through high pedestrian and auto traffic, only the human system can perform with effectiveness and safety.
Of course, the current rather dumb self-drive systems can function safely if the maximum speed is held to perhaps 5 mph. How long will it be before a self-drive system can take an automobile safely at high speeds from one end of a city to the other?
Pretty good read... here is a snippet...
https://motherboard.vice.com/en_us/article/j5a8d3/self-driving-car-policy-uber
QuoteImagine you're in a self-driving car going down a road when, suddenly, the large propane tanks hauled by the truck in front of you fall out and fly in your direction. A split-second decision needs to be made, and you can't think through the outcomes and tradeoffs for every possible response. Fortunately, the smart system driving your car can run through tons of scenarios at lightning fast speed. How, then, should it determine moral priority?
Consider the following possibilities:
Your car should stay in its lane and absorb the damage, thereby making it likely that you'll die.
Your car should save your life by swerving into the left lane and hitting the car there, sending the passengers to their deaths—passengers known, according to their big data profiles, to have several small children.
Your car should save your life by swerving into the right lane and hitting the car there, sending the lone passenger to her death—a passenger known, according to her big data profile, to be a scientist who is coming close to finding a cure for cancer.
Your car should save the lives worth the most, measured according to amount of money paid into a new form of life assurance insurance. Assume that each person in a vehicle could purchase insurance against these types of rare but inevitable accidents, and then, smart cars would prioritize based on their ability and willingness to pay.
Your car should save your life and embrace a neutrality principle in deciding among the means for doing so, perhaps by flipping a simulated coin and swerving to the right if heads comes up and swerving to the left if it's tails.
Your car shouldn't prioritize your life and should embrace a neutrality principle by randomly choosing among the three options.
Your car should execute whatever option most closely matches your personal value system and the moral choices you would have made if you were capable of doing so. Assume that when you first purchased your car, you took a self-driving car morality test consisting of a battery of scenarios like this one and that the results "programmed" your vehicle.
Nice article, but I imagine in the overall majority of cases, it's going to end up like how Mercedes chose, where the car will always prioritize saving its occupant because most people won't buy cars they know will consider killing them.
Sure, there are altruistic people out there who would sacrifice themselves to better society, but that's not representative of most people. At best, it'll probably be Option 5.
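For what it's worth, the "Option 5" behavior described above is really just a configurable tie-break rule. A toy sketch, purely hypothetical and not based on any real vendor's planner:

import random
from dataclasses import dataclass
from typing import List

@dataclass
class Maneuver:
    name: str
    occupant_survives: bool  # hypothetical output of some upstream prediction

def choose_maneuver(options: List[Maneuver], rng: random.Random) -> Maneuver:
    """Prefer maneuvers the occupant survives; break ties at random (the 'neutrality principle')."""
    safe_for_occupant = [m for m in options if m.occupant_survives]
    candidates = safe_for_occupant or options  # if nothing saves the occupant, choose among all
    return rng.choice(candidates)

if __name__ == "__main__":
    options = [
        Maneuver("stay in lane", occupant_survives=False),
        Maneuver("swerve left", occupant_survives=True),
        Maneuver("swerve right", occupant_survives=True),
    ]
    print(choose_maneuver(options, random.Random(0)).name)

Everything hard, of course, hides in how you would ever populate something like occupant_survives reliably in the fraction of a second available.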
Or cars become sentient and seek to preserve their own "life" above all others.
The system slams on the brakes and veers hard left, causing the rear end to swing around and absorb the brunt of the damage while 'protecting' the driver. But then it releases the brakes a quarter-spin later, cuts the wheel hard right, and slams into the left median. The sudden impact destroys the driver and passengers, but the crumple zones built into the car save the CPU under the hood by forcing a rear impact.
Quote from: Non-RedNeck Westsider on March 25, 2018, 12:44:12 PM
Or cars become sentient and seek to preserve their own "life" above all others.
The system slams on the brakes and veers hard left, causing the rear end to swing around and absorb the brunt of the damage while 'protecting' the driver. But then it releases the brakes a quarter-spin later, cuts the wheel hard right, and slams into the left median. The sudden impact destroys the driver and passengers, but the crumple zones built into the car save the CPU under the hood by forcing a rear impact.
You need to take the blue pill, maaaaan.
You won't have to worry about this scenario happening for many, many, many, many years, if ever. The amount of processing power that a car would need for a reliable system of that capability is absolutely immense. The entire trunk of a current AV is taken up by computers as it is, and you see how well they're doing. The amount of information required is also immense, and verges on requiring a dystopian future, or at least opens a can of worms toward one.
Uber's AVs banned in Arizona for failure to comply with safety standards following fatal collision:
http://www.alphr.com/cars/1008913/uber-autonomous-cars-safety
^Good.
Just to reiterate a point that has been made often since this incident...google will guide you if you wish to read...
Uber is many years behind the leaders in AV tech. And given Uber's compliance track record as a company, it probably deserves far more scrutiny than it has received to this point.