
Wednesday, October 31, 2018

Enter the moral maze

As self-driving cars look set to become a common sight on our streets, we’re speeding into a new world of ethical dilemmas

If a tree falls in the middle of the forest and there’s no one around to hear it, does it make a sound?

Erm… who cares? There are some philosophical puzzles that are diverting to discuss at a dinner party but that really don't matter in the grand scheme of things, because they have no impact on our daily lives. With the rate at which technology is advancing and influencing the way we live, however, there are some moral dilemmas that must be faced.

This became apparent this week, when the results of an MIT Media Lab study were published. Since its launch in 2016, the Moral Machine experiment has gathered some 40 million responses to a classic ethical quandary. Its questions were based on the runaway tram problem, in which people must choose between taking no action, which lets the tram kill five people on the track ahead, and diverting it onto a side track, where it kills one person instead.

Updated for the modern world, MIT Media Lab's dilemma asked who a self-driving car should sacrifice if it has to make an autonomous decision in an unavoidable crash. Should it be programmed to hit older people rather than younger people? Pedestrians rather than other vehicles? Animals rather than people? Poor people and criminals rather than successful entrepreneurs?
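Part of what makes these questions so uncomfortable is that, for a programmed vehicle, any answer eventually has to become a line of code. The sketch below is a purely hypothetical illustration in Python, not MIT's methodology and nothing like real vehicle software; the PRIORITY ranking, the Party categories and the choose_casualty helper are all invented here to show how crude a hard-coded preference looks once it is written down:

```python
from dataclasses import dataclass

# Hypothetical ranking a programmer would have to commit to in advance.
# Earlier in the list = less protected. The ordering here is arbitrary,
# and that arbitrariness is exactly the moral dilemma.
PRIORITY = ["animal", "adult_pedestrian", "child_pedestrian"]

@dataclass
class Party:
    label: str   # one of the PRIORITY categories
    count: int   # how many would be harmed

def choose_casualty(options):
    """Given the possible outcomes of an unavoidable crash, pick the one
    to accept: least-protected category first, then fewest people harmed."""
    return min(options, key=lambda p: (PRIORITY.index(p.label), p.count))

# Example: swerve into one adult, or carry on into two children?
outcomes = [Party("adult_pedestrian", 1), Party("child_pedestrian", 2)]
print(choose_casualty(outcomes).label)  # -> adult_pedestrian
```

Whatever ordering goes into that list, somebody's values are baked in before the car ever leaves the factory. That, in effect, is the choice the study asked its 40 million respondents to make for themselves.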

What would you do? 

The answers probably feel impossible when you imagine a real-world scenario. Responses to the study varied greatly, with some clear geographical trends emerging. But this isn't a hypothetical exercise; in the coming years, these moral dilemmas will need to be considered – and answered – by the people putting driverless vehicles on our streets.

Are you working on autonomous technology? Please get in touch on 0800 772 0800 and share your story with us.

 
