I've recently read that the development of self-driving vehicles is becoming a reality. People have even proposed that we will see driverless trucks on the roads within 10 years. Let's assume that all the technical obstacles are overcome and cars and trucks can get from A to B without driver intervention.
My question is an ethical one.
Scenario - A driverless truck is navigating a major road when a woman pushing a pram steps out in front of the vehicle. Would the smart truck avoid the pedestrian, sacrificing the truck and its occupants to save the woman and child, or would it, unable to stop, plow right through, saving the vehicle and its occupants?
So, the broad question is - should a driverless vehicle give priority to others over its occupants, and how would you feel being in that vehicle?
Tamey said
12:53 PM Jul 2, 2016
I read in the paper last week that owners of driverless vehicles may have the option of programming the vehicle to sacrifice the lives of the driver and passengers in the event that continuing would cause harm to pedestrians and/or other vehicles.
The vehicle would swerve away, possibly hitting a tree or other obstacle, to avoid harming others; it is suggested this would happen in such a situation the same as if a driver were in control of the vehicle.
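A rough sketch of what such an owner-configurable policy could look like. All names, fields, and the decision rule itself are illustrative assumptions, not any real manufacturer's API:

```python
# Hypothetical sketch of the owner-configurable swerve policy described above.
# Every name and threshold here is invented for illustration.

from dataclasses import dataclass

@dataclass
class Hazard:
    pedestrians_ahead: int   # people currently in the vehicle's path
    can_stop_in_time: bool   # braking alone would avoid the collision
    swerve_path_clear: bool  # swerving avoids both pedestrians and obstacles

def choose_action(hazard: Hazard, protect_others_first: bool) -> str:
    """Return 'continue', 'brake', or 'swerve' under a simple priority rule."""
    if hazard.pedestrians_ahead == 0:
        return "continue"
    if hazard.can_stop_in_time:
        return "brake"   # stopping harms no one, so it is always preferred
    if protect_others_first or hazard.swerve_path_clear:
        return "swerve"  # owner opted in: swerve even into a tree if need be
    return "brake"       # otherwise brake anyway and minimise occupant risk

print(choose_action(Hazard(1, False, False), protect_others_first=True))  # swerve
```

The key point of the debate is the `protect_others_first` flag: whether that choice should sit with the owner at all, rather than being fixed by law or by the manufacturer.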
dorian said
01:16 PM Jul 2, 2016
Isaac Asimov's "Three Laws of Robotics":
A robot may not injure a human being or, through inaction, allow a human being to come to harm.
A robot must obey orders given it by human beings except where such orders would conflict with the First Law.
A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
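What makes the Three Laws interesting here is their strict priority ordering: each lower law only applies when every higher law is satisfied. A toy illustration (the predicate names are invented for the sketch):

```python
# Toy encoding of the strict priority ordering in Asimov's Three Laws.
# Each law is checked only after every higher-priority law is satisfied.

def evaluate(harms_human: bool, allows_harm_by_inaction: bool,
             disobeys_order: bool, endangers_self: bool) -> str:
    """Return which law (if any) a proposed action violates, checked in order."""
    # First Law: never injure a human, nor allow harm through inaction.
    if harms_human or allows_harm_by_inaction:
        return "violates First Law"
    # Second Law: obey human orders (unless they conflict with the First Law,
    # which the check above has already ruled out).
    if disobeys_order:
        return "violates Second Law"
    # Third Law: self-preservation yields to both higher laws.
    if endangers_self:
        return "violates Third Law"
    return "permitted"
```

Under this ordering, a driverless vehicle would sacrifice itself (Third Law) before harming a pedestrian (First Law) - which is exactly the dilemma in the original scenario, where either outcome harms a human.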
-- Edited by dorian on Saturday 2nd of July 2016 01:17:46 PM
Aus-Kiwi said
01:28 PM Jul 2, 2016
There's a few out there .. In Prado's. Lol
Delta18 said
01:01 PM Jul 3, 2016
A fellow in the USA a couple of days ago became the first person to be killed in an MVA in a driverless vehicle.
His car didn't notice a white truck turning across his path against a bright background.
More mods required I think.