The Ethics of Self-driving Cars
Self-driving cars are already cruising the streets today. And while these cars will ultimately be safer and cleaner than their manual counterparts, they can’t avoid accidents altogether. How should a car be programmed if it encounters an unavoidable accident? Patrick Lin navigates the murky ethics of self-driving cars.
After watching the video on the ethical dilemma of self-driving cars, use the discussion questions to investigate the issues that are raised.
- In the situation described, would you prioritize your safety over everyone else’s by hitting the motorcycle?
- In the situation described, would you minimize danger to others by not swerving, even though you would hit the large object and potentially die?
- In the situation described, would you take the middle ground by hitting the SUV since it’s less likely the driver will be injured? Compare this to hitting the motorcycle.
- What should a self-driving car do?
- What is the difference between a “reaction” (a human driver’s split-second response) and a “deliberate decision” (a driverless car’s calculated response)?
- Programming a car to react in a certain way in an emergency accident situation could be viewed as premeditated homicide. Do you think this is a valid argument? Why or why not?