

Tesla ‘Autopilot’.

The thing with AI is that, unlike humans, it won't panic or freeze when confronted with seemingly intractable ethical dilemmas.

[Tom Gauld cartoon: tgns296updated.jpg]


However, the many layers of a deep neural network, assuming it has sufficient and appropriate training data, will look at this dilemma and conclude that Tom Gauld is odd but brilliant.

 
Pull the lever of course.
Boneless giraffe and fried egg is particularly succulent and you get a barbequed lobster for free....
 
The higher levels need car-to-car communication, but not just that: in my view it needs all cars communicating, as a single rogue human driver could cause havoc.
We will probably see this by the end of the decade on dedicated, barricaded-off motorway lanes. An easy-to-manage environment without pedestrians and two-wheelers.
 
We will probably see this by the end of the decade on dedicated, barricaded-off motorway lanes. An easy-to-manage environment without pedestrians and two-wheelers.

I have heard suggestions that automated vehicles should have their own lane, so they don't have to mix with unpredictably behaving vehicles driven by humans.
 

A fascinating look at Tesla’s new ‘full self drive’ beta. Musk has clearly devalued the brand with his vile MAGA/QAnon and personal vendetta nutjobbery on Twitter, but there is still some fascinating technology here for whoever buys the company when it inevitably crashes and burns!

PS Marques Brownlee’s video above is entirely non-political. He’s a very good tech reviewer doing a very good real-world test of the beta.
 
Looks like the newly proposed laws will make the manufacturer liable for incidents rather than the driver, which is good for the car owner if, for example, the car doesn't detect a cyclist or a pedestrian in the dark.

https://www.gov.uk/government/news/self-driving-revolution-to-boost-economy-and-improve-road-safety

""The legislation will build on existing laws, and state that manufacturers are responsible for the vehicle’s actions when self-driving, meaning a human driver would not be liable for incidents related to driving while the vehicle is in control of driving.""
That will help with sales if the driver is not liable for close passes, accidents or insurance claims.
 
What, you mean that in the future I could get into a vehicle and it would bring me where I wanted to go without me having to drive?
I would pay a good deal to be able to climb into the back seat of my car at hometime, get under a duvet and go to sleep until I arrived home.
 
Looks like the newly proposed laws will make the manufacturer liable for incidents rather than the driver, which is good for the car owner if, for example, the car doesn't detect a cyclist or a pedestrian in the dark.

https://www.gov.uk/government/news/self-driving-revolution-to-boost-economy-and-improve-road-safety

""The legislation will build on existing laws, and state that manufacturers are responsible for the vehicle’s actions when self-driving, meaning a human driver would not be liable for incidents related to driving while the vehicle is in control of driving.""
That will help with sales if the driver is not liable for close passes, accidents or insurance claims.
I’d rather not own a car that might crash or kill someone, even if I were absolved of blame.
 
I’d rather not own a car that might crash or kill someone, even if I were absolved of blame.
The next legal stage will be: can the passengers in the vehicle claim against the car maker for their trauma/PTSD when someone is killed? Bear in mind the software may make a choice to collide with someone because it's the least bad choice according to the data available and the algorithm in use.
 
Tricky one. The problem is human drivers sadly do cause injuries and fatalities - so the benchmark isn't zero risk. That said, I think they should demand a standard far better than human average (difficult but of obvious benefit).
 
I’d rather not own a car that might crash or kill someone, even if I were absolved of blame.

You already do. Cars kill and injure thousands of people every year. As a cyclist who has ended up under a transit van I welcome the idea of good automation as it will never be drunk, stoned or distracted the way human drivers so often are. We are clearly still a long way off, but even as an always-on safety override for human stupidity I’d welcome it.
 
Tricky one. The problem is human drivers do cause accidents and fatalities - so the benchmark isn't zero risk, but human driver risk. That said, I think they should demand a standard far better than human average (difficult but of obvious benefit).
It is tricky: accidents might be fewer, but they will be different, so the software may kill people who might not have been killed had a human been at the controls. Maybe we have to see it as we do immunisations - this is sort of OK because immunisations kill far fewer people than the virus would have. We won't know with autonomous cars until they are fully deployed.
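
As a back-of-the-envelope way of framing that trade-off, here is a toy calculation. Every figure in it is invented purely for illustration, not a real statistic:

```python
# Toy comparison of expected fatalities under human vs. autonomous driving.
# All numbers below are invented for illustration only.

miles_driven_per_year = 300e9     # hypothetical total miles driven per year
human_fatality_rate = 12e-9       # hypothetical fatalities per mile, human drivers
autonomous_fatality_rate = 3e-9   # hypothetical fatalities per mile, autonomous fleet

human_deaths = miles_driven_per_year * human_fatality_rate
autonomous_deaths = miles_driven_per_year * autonomous_fatality_rate

print(f"Human drivers:    ~{human_deaths:,.0f} deaths/year")
print(f"Autonomous fleet: ~{autonomous_deaths:,.0f} deaths/year")
print(f"Net lives saved:  ~{human_deaths - autonomous_deaths:,.0f}")

# The point of the immunisation analogy: even if the total is lower, some of
# the autonomous deaths would be *different* people from those who would have
# died under human drivers.
```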
 
Critical is the ability for cars to all talk to each other so they know what every car around them will do. Hence, initially, cars will likely be restricted to autonomous-only routes. Having inconsistent humans mixed up with autonomous vehicles would be a recipe for disaster.
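
Purely as a sketch of the idea, cars broadcasting their intentions to each other might look something like this. The message fields, names and thresholds are invented for illustration and don't correspond to any real V2V standard:

```python
import json
import time
from dataclasses import dataclass, asdict

# Hypothetical car-to-car "intent" message; fields are invented for illustration.
@dataclass
class IntentMessage:
    vehicle_id: str
    timestamp: float
    position: tuple       # (latitude, longitude)
    speed_mps: float
    planned_action: str   # e.g. "lane_change_left", "braking", "continue"

def broadcast(msg: IntentMessage) -> str:
    """Serialise the intent so nearby vehicles can plan around it."""
    return json.dumps(asdict(msg))

def plan_around(own_speed: float, neighbours: list[IntentMessage]) -> str:
    """Crude example: slow down if any nearby car announces hard braking."""
    if any(n.planned_action == "braking" for n in neighbours):
        return "reduce_speed"
    return "maintain_speed"

nearby = [IntentMessage("CAR-42", time.time(), (51.5, -0.12), 28.0, "braking")]
print(plan_around(30.0, nearby))   # -> "reduce_speed"
```

A lone human driver, of course, broadcasts nothing, which is why mixing the two is the hard part.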
 
Why can't the driver disengage the autopilot by turning the steering wheel or using the brake pedal?
Seems a bit ED-209 to me....
 
You already do. Cars kill and injure thousands of people every year. As a cyclist who has ended up under a transit van I welcome the idea of good automation as it will never be drunk, stoned or distracted the way human drivers so often are. We are clearly still a long way off, but even as an always-on safety override for human stupidity I’d welcome it.
I don't think I'd have killed that deer that was run over on an open road by a Tesla.
And I often have to take evasive action to avoid cretinous cycling that I doubt a self-drive Tesla would.
 
It is tricky: accidents might be fewer, but they will be different, so the software may kill people who might not have been killed had a human been at the controls. Maybe we have to see it as we do immunisations - this is sort of OK because immunisations kill far fewer people than the virus would have. We won't know with autonomous cars until they are fully deployed.
In a situation where someone's death might be inevitable, and the situation arises so suddenly that the autopilot cannot avoid hitting anyone, these cars should be programmed to apply maximum braking in a straight line - as most human drivers would do.

One would hope that these cars, which are constantly monitoring every element of road conditions and traffic, would avoid those incidents caused by the inattention and poor driving technique of some human drivers. I doubt there are many situations where an "inevitably kill him or kill her" choice actually arises.
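
In code terms, the "brake hard in a straight line unless a safe escape exists" rule above could be as simple as the sketch below. The function name, inputs and values are hypothetical, not how any real system is implemented:

```python
# Hypothetical emergency-response rule: if a collision cannot be avoided,
# brake as hard as possible while keeping the wheels straight, rather than
# trying to "choose" a victim. Names and values are invented for illustration.

def emergency_response(clear_escape_path: bool) -> dict:
    """Return steering/braking commands for an imminent-collision situation."""
    if clear_escape_path:
        # A safe swerve exists: take it and brake moderately.
        return {"steering": "follow_escape_path", "brake": 0.6}
    # No safe option: maximum braking, no steering input - as a human would.
    return {"steering": "hold_straight", "brake": 1.0}

print(emergency_response(clear_escape_path=False))
# -> {'steering': 'hold_straight', 'brake': 1.0}
```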
 
Why can't the driver disengage the autopilot by turning the steering wheel or using the brake pedal?
Seems a bit ED-209 to me....
I've not read the detail on the beta software, but this is possible with today's less sophisticated systems. The thing is that the more autonomous cars become, the more disconnected the "driver" becomes; they likely will not have a clue about the vehicles around them. Also, over time, vehicles won't have controls that humans can operate - for cost reasons and also to prevent human interference, which could be a good thing.
 

