I've recently read that the development of self-driving vehicles is becoming a reality. Some have even proposed that we will see driverless trucks on the roads within 10 years. Let's assume that all the technical obstacles are overcome and cars and trucks can get from A to B without driver intervention.
My question is an ethical one.
Scenario: A driverless truck is navigating a major road and a woman pushing a pram steps out in front of the vehicle. Would the smart truck swerve to avoid the pedestrian, sacrificing the truck and its occupants to save the woman and child, or would it, unable to stop, plow right through, saving the vehicle and its occupants?
So, the broad question is: should a driverless vehicle give priority to others over its occupants, and how would you feel being in such a vehicle?
I read in the paper last week that owners of driverless vehicles may have the option of programming the vehicle to sacrifice the lives of the driver and passengers in the event that it might otherwise cause harm to pedestrians and/or other vehicles.
The vehicle would swerve away, possibly hitting a tree or another obstacle, to avoid harming others, and it is suggested that this is what would happen in a comparable situation if a human driver were in control of the vehicle.