Tesla ‘Autopilot’.

There are countless scenarios. Why is simply braking in a straight line the best one? It may well not be what a human would do, so this has already altered the outcome of some situations. This is not a tech/AI discussion; it's a moral and legal one. The legal view will no doubt win out.
In the particular situation you described, what other option is there but an emergency stop? Are you seriously suggesting that a human driver would not slam the brakes on or would have the time to choose who to run down?
 
Sure, Cav. My point is really that distasteful ethical choices cannot be ignored by the designers. If an optimal outcome isn't achieved because the scenario was never considered, then unnecessary harm will occur and the designer may end up in court.
When it comes to emergency action when death is inevitable (which is where we started), one would hope that both the program and the law would start from the question "what would a human do?". What a human would not do is take a second or two to decide whether person A or person B should be run down. And if that is acceptable, and it generally is, why should we expect more of AI in that particular scenario?
 
I've participated in workshops with a couple of car OEMs; they went into a lot of detail about scenarios, and then the lawyers got involved. The OEMs are usually very risk-averse, so they are very cautious about this stuff.
 
In the particular situation you described, what other option is there but an emergency stop? Are you seriously suggesting that a human driver would not slam the brakes on or would have the time to choose who to run down?
My thinking is that a human would brake and then turn the steering wheel just before the collision.
 
In the particular situation you described, what other option is there but an emergency stop? Are you seriously suggesting that a human driver would not slam the brakes on or would have the time to choose who to run down?
Actually, sometimes you do. One of my near misses, which still has me sweating when I think about it, was on the M1 some 20-odd years ago. I was following a friend of mine; she'd invited me to dinner and said "follow me". Off she went at literally 90mph. I followed. We were zipping up Lane 2 and she moved into Lane 3. "Jesus," I thought, "she's speeding up," so I too moved into Lane 3 and put my foot down. Cue brake lights. I stepped on the brakes. This was pre-ABS days, so everybody concerned was standing on the brakes with locked wheels. I could see that I was about to go ploughing into the queue of traffic, and by some miracle I had the sense to think that I *possibly* had a better chance if I released the brake, steered from Lane 3 to the hard shoulder and then braked again. So I did. I must have had a guardian angel, because in releasing the brake I could steer; as soon as I got the car headed for the hard shoulder I stepped hard on the brake again and drew to a stop unscathed, having passed about two or three stationary vehicles in Lane 1 on my right. Now, I had no idea that the hard shoulder was clear; it was just a lucky guess. There could have been a stationary vehicle on it, and I had no way of knowing. I did of course have a potential Plan C involving the banking, but none of this had been rationally thought through. But I did nevertheless choose: there was an element of what today would be termed a DRA (dynamic risk assessment), because I was making reasonably rational choices on the hoof, based on incomplete information. On that occasion I got away with it, but it might just as well have ended in tears. I rolled the dice, reckoning that a double six was unlikely. I was right, once.

Emergency services use DRA as a tool all the time. If they turn up at a fire or other disaster they can't wait for all the information to come in; they have to make decisions quickly, without time to do all the necessary computation. Obviously they cannot always get it right, as there is an element of probability involved. You can't know whether the pedestrian is going to jump right, jump left or stay still, so what do you do? You have to guess best/worst/most likely and act accordingly.
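To put the "best/worst/most likely" idea into something more concrete, here is a very rough Python sketch. Everything in it is invented for illustration (the two manoeuvres, the harm scores and the weights are guesses, not data from any real system); the point is just that each candidate action gets scored against the outcomes you can foresee, and how much weight you put on the worst case decides which action wins.

# Purely illustrative sketch of "guess best/worst/most likely and act accordingly".
# All actions, outcomes and numbers are made up for the example.

# Guessed harm scores (0 = no harm, 1 = worst imaginable) for each foreseen outcome.
outcomes = {
    "brake_in_lane":           {"best": 0.0, "most_likely": 0.6, "worst": 0.8},
    "swerve_to_hard_shoulder": {"best": 0.0, "most_likely": 0.2, "worst": 1.0},
}

def assessed_harm(action, weights=None):
    # Weight the best, most-likely and worst foreseen outcomes.
    # Piling weight onto "worst" makes the assessment more cautious.
    weights = weights or {"best": 0.2, "most_likely": 0.5, "worst": 0.3}
    o = outcomes[action]
    return sum(weights[k] * o[k] for k in weights)

least_bad = min(outcomes, key=assessed_harm)
print(least_bad, round(assessed_harm(least_bad), 2))

With the fairly optimistic default weighting the sketch picks the hard shoulder; shift the weight onto "worst" (say {"best": 0.0, "most_likely": 0.2, "worst": 0.8}) and braking in lane comes out ahead instead. Same guesses, different temperament.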
 
Actually, sometimes you do. One of my near misses, which still has me sweating when I think about it, was on the M1 some 20-odd years ago. I was following a friend of mine; she'd invited me to dinner and said "follow me". Off she went at literally 90mph. I followed. We were zipping up Lane 2 and she moved into Lane 3. "Jesus," I thought, "she's speeding up," so I too moved into Lane 3 and put my foot down. Cue brake lights. I stepped on the brakes. This was pre-ABS days, so everybody concerned was standing on the brakes with locked wheels. I could see that I was about to go ploughing into the queue of traffic, and by some miracle I had the sense to think that I *possibly* had a better chance if I released the brake, steered from Lane 3 to the hard shoulder and then braked again. So I did. I must have had a guardian angel, because in releasing the brake I could steer; as soon as I got the car headed for the hard shoulder I stepped hard on the brake again and drew to a stop unscathed, having passed about two or three stationary vehicles in Lane 1 on my right. Now, I had no idea that the hard shoulder was clear; it was just a lucky guess. There could have been a stationary vehicle on it, and I had no way of knowing. I did of course have a potential Plan C involving the banking, but none of this had been rationally thought through. But I did nevertheless choose: there was an element of what today would be termed a DRA (dynamic risk assessment), because I was making reasonably rational choices on the hoof, based on incomplete information. On that occasion I got away with it, but it might just as well have ended in tears. I rolled the dice, reckoning that a double six was unlikely. I was right, once.

Emergency services use DRA as a tool all the time. If they turn up at a fire or other disaster they can't wait for all the information to come in; they have to make decisions quickly, without time to do all the necessary computation. Obviously they cannot always get it right, as there is an element of probability involved. You can't know whether the pedestrian is going to jump right, jump left or stay still, so what do you do? You have to guess best/worst/most likely and act accordingly.
I dare say a competent AI would not have allowed that situation to occur in the first place.
 
I dare say a competent AI would not have allowed that situation to occur in the first place.
You would hope so, indeed. The whole point of AI is that it doesn't do the misguided and/or just plain stupid stuff that flesh-and-blood drivers do. But that's not the point I was making. The point still stands, as you asked, that flesh-and-blood drivers can and do make decisions on the hoof, and they can equally evaluate on the hoof how well those decisions are working and modify them accordingly. The fact that I would have been better placed not to be driving like a tw&t in the first place has not escaped me.

Your point was aimed at an AI driver making the least-bad decision once it is already in a sticky situation; we'd all hope that an AI would be smart enough to avoid getting into the sticky situation in the first place, and nine times out of ten it is. After all, a modern automatic gearbox offers both better 0-60 times and better fuel economy than a manual box with a fleshy driver, because it's making better decisions.
 
You would hope so, indeed. The whole point of AI is that it doesn't do the misguided and/or just plain stupid stuff that flesh-and-blood drivers do. But that's not the point I was making. The point still stands, as you asked, that flesh-and-blood drivers can and do make decisions on the hoof, and they can equally evaluate on the hoof how well those decisions are working and modify them accordingly.
One other factor which I try to apply in appropriate circumstances is that whatever you do should, so far as is reasonably practical, be predictable. So in one of those dynamic situations, not doing something unpredictable gives those around you a better chance to make decent decisions too.

In that case, there's an argument for Cav's contention because if you know that the AI will brake in a straight line, you can factor that into decision making for the other vehicles involved too.

The arguments are rarely as simple as 'mum and pushchair in road vs old geezer on pavement' either. It might just as well be 'hit the old geezer (with brittle bones) at 20mph, or the young, fit mum and the child in a protective container at 10mph'.
 
Not an AI question, but one of ethics. AI just implements the decision-making.
Agreed, you probably include this alongside your AI lectures.

As I said in another post, it's not a technical issue; it's a moral and legal one. Legalities will vary by country when we get into the fine detail. Our Highway Code will need to be built in, as would merge-in-turn practices in the US and turning right on red lights in some states.
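Just to illustrate the "built in" bit: something like the sketch below, where the planner consults a different rule table depending on the jurisdiction it is operating in. This is purely a sketch; the jurisdiction names and rule values are placeholders, not statements of what any country's or state's law actually says.

# Illustrative only: jurisdiction names and rule values are placeholders.
RULES = {
    "UK":         {"drive_on": "left",  "right_turn_on_red": False, "merge_in_turn": True},
    "US_STATE_A": {"drive_on": "right", "right_turn_on_red": True,  "merge_in_turn": True},
    "US_STATE_B": {"drive_on": "right", "right_turn_on_red": False, "merge_in_turn": True},
}

def may_turn_on_red(jurisdiction):
    # Fall back to the more restrictive behaviour if the jurisdiction is unknown.
    return RULES.get(jurisdiction, {}).get("right_turn_on_red", False)

print(may_turn_on_red("US_STATE_A"))  # True
print(may_turn_on_red("UK"))          # False

The fine detail, as you say, is where it gets messy: the rule tables have to be maintained per jurisdiction, and the law decides what goes in them, not the engineers.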
 
Agreed, you probably include this alongside your AI lectures.

We do; quite a lot of our research is into explainable AI. For our undergraduates it can be useful to split out the ethical/moral dilemmas from the computing/AI stuff. I deal with the former.
 
One other factor which I try to apply in appropriate circumstances is that whatever you do should, so far as is reasonably practical, be predictable. So in one of those dynamic situations, not doing something unpredictable gives those around you a better chance to make decent decisions too.

In that case, there's an argument for Cav's contention because if you know that the AI will brake in a straight line, you can factor that into decision making for the other vehicles involved too.

The arguments are rarely as simple as 'mum and pushchair in road vs old geezer on pavement' either. It might just as well be 'hit the old geezer (with brittle bones) at 20mph, or the young, fit mum and the child in a protective container at 10mph'.
It's incredibly easy to snap the neck of a baby, so if you're thinking a 10mph crash into a pram should be survivable, you may be surprised. Babies in proper seats have died when cars have been side-swiped while the adults were totally uninjured. There are so many scenarios to account for. We'll get smarter at this, but we will also make mistakes along the way.
 
One other factor which I try to apply in appropriate circumstances is that whatever you do should, so far as is reasonably practical, be predictable. So in one of those dynamic situations, not doing something unpredictable gives those around you a better chance to make decent decisions too.

In that case, there's an argument for Cav's contention because if you know that the AI will brake in a straight line, you can factor that into decision making for the other vehicles involved too.

The arguments are rarely as simple as 'mum and pushchair in road vs old geezer on pavement' either. It might just as well be 'hit the old geezer (with brittle bones) at 20mph, or the young, fit mum and the child in a protective container at 10mph'.
Don't forget to factor in the % chance of it happening too. After all, in the chaos of a traffic collision there's a lot that you just don't know. What's the % chance of the guy jumping out of the way? Does that change if the car is coming directly for him? You bet it does. What's the % chance of the woman seeing the car coming and throwing the pushchair to the side? You just can't tell.
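If you wanted to put numbers on that, it might look like the sketch below (the probabilities and harm scores are pure guesses, just to show the shape of the calculation). The key point is that the chance of the pedestrian getting clear isn't fixed: it depends on what the car does, so each manoeuvre has to be scored with its own conditional probability.

# Purely illustrative: all probabilities and harm values are invented.
# The chance of the pedestrian getting clear depends on the manoeuvre chosen,
# so each option's expected harm uses a probability conditional on that option.
scenarios = {
    "brake_in_a_straight_line": {"p_clears": 0.7, "harm_if_hit": 0.8},
    "brake_and_swerve":         {"p_clears": 0.4, "harm_if_hit": 0.9},
}

def expected_harm(manoeuvre):
    s = scenarios[manoeuvre]
    return (1 - s["p_clears"]) * s["harm_if_hit"]

for manoeuvre in sorted(scenarios, key=expected_harm):
    print(manoeuvre, round(expected_harm(manoeuvre), 2))

With these guesses, braking in a straight line wins; change the guesses and the ranking flips, which is really the point: the answer hinges on probabilities nobody can know at the time.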
 
A few years ago I had a chance to ride in a fully autonomous car that the university where I work has developed.

It was a bit unnerving to let the car drive around, even though it was on a closed track, as it navigated obstacles and hazards on a very soggy day. But I gotta say, the 'puter was a better driver than a lot of people. No signs of road rage, engine revving, testosterone or impatience.

[photos of the autonomous car]


And it had more sensors and lights than the ship, but unfortunately no cool blippy bleep sounds.

Joe
 
Actually, sometimes you do. One of my near misses, which still has me sweating when I think about it, was on the M1 some 20-odd years ago. I was following a friend of mine; she'd invited me to dinner and said "follow me". Off she went at literally 90mph. I followed. We were zipping up Lane 2 and she moved into Lane 3. "Jesus," I thought, "she's speeding up," so I too moved into Lane 3 and put my foot down. Cue brake lights. I stepped on the brakes. This was pre-ABS days, so everybody concerned was standing on the brakes with locked wheels. I could see that I was about to go ploughing into the queue of traffic, and by some miracle I had the sense to think that I *possibly* had a better chance if I released the brake, steered from Lane 3 to the hard shoulder and then braked again. So I did. I must have had a guardian angel, because in releasing the brake I could steer; as soon as I got the car headed for the hard shoulder I stepped hard on the brake again and drew to a stop unscathed, having passed about two or three stationary vehicles in Lane 1 on my right. Now, I had no idea that the hard shoulder was clear; it was just a lucky guess. There could have been a stationary vehicle on it, and I had no way of knowing. I did of course have a potential Plan C involving the banking, but none of this had been rationally thought through. But I did nevertheless choose: there was an element of what today would be termed a DRA (dynamic risk assessment), because I was making reasonably rational choices on the hoof, based on incomplete information. On that occasion I got away with it, but it might just as well have ended in tears. I rolled the dice, reckoning that a double six was unlikely. I was right, once.
I had a similar experience on two wheels, riding on a desolate country road at about 120km/h. As I crested a hill, I saw an escaped cow slowly crossing the road, right in my path of travel. There was no way I could have stopped in time, so I had one of two choices: veer left or veer right. In a blink, I saw that the cow had seen me, so I anticipated she would stop walking, and I chose to veer right, across her front. She did stop, and I missed her by about two feet. The whole drama lasted all of about five seconds.
 

