
Self-Driving Cars and Moral Decisions - Who Will Live, Who Will Die?

27-10-2018 | Blacklisted News

The study found cultural and economic differences. People from Asian countries with a Confucian tradition showed a higher preference for old people to survive. Countries with a Christian tradition showed a stronger preference for the young. People in Latin America preferred the survival of males more than people in other cultures did. As an older male in Europe, I am not really comfortable with these results.


Inevitably, the inclusion of such preferences in decision-making machines will at some point be legislated. Would politicians regulate them in their own favor?


The people who took the test disfavored 'criminals'. Should the 'Moral Machine' decision be combined with some kind of social scoring?


The Chinese government is currently implementing a social credit system for all its citizens. A person's reputation will be judged by a single number calculated from several factors. A traffic ticket will decrease one's personal reputation; behaving well toward one's neighbors can increase it. Buying too much alcohol is bad for one's score; publicly lauding the political establishment is good. A bad reputation will have consequences: those with low ratings may not be allowed to fly or to visit certain places.
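To make the mechanism concrete, here is a minimal sketch in Python of how such a single-number reputation could be computed. Every factor name, weight, and threshold below is an invented assumption for illustration; nothing here describes the actual Chinese system, whose inner workings are not public.

```python
# Hypothetical sketch of a single-number reputation score.
# All factors and weights are invented assumptions, not real data.

BASE_SCORE = 1000

# Positive weights raise the score, negative weights lower it.
WEIGHTS = {
    "traffic_tickets": -50,        # each ticket costs points
    "excess_alcohol_purchases": -5,
    "good_neighbor_reports": 20,
    "pro_government_posts": 10,
}

def reputation_score(events: dict[str, int]) -> int:
    """Collapse counted life events into one number."""
    score = BASE_SCORE
    for factor, count in events.items():
        score += WEIGHTS.get(factor, 0) * count
    return score

# Example: two traffic tickets and one lauded post.
print(reputation_score({"traffic_tickets": 2, "pro_government_posts": 1}))
# -> 910; below some cutoff, a flight booking might be refused.
```

The sketch shows only the essential point: arbitrary behaviors are collapsed into one opaque number, and all the moral weight hides in whoever sets the weights.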


The concept sounds horrible, but it is neither new nor especially Chinese. Credit scores are routinely used to decide whether people can get a loan for a house. Today's credit scoring systems are black boxes: the data they work with is often outdated or wrong, and the companies that run them do not explain how the judgment is made. The U.S. government's No Fly List demonstrates that a state-run system is not much better.


The first wave of the computer revolution created stand-alone systems. The current wave combines them into new and much larger ones.


It is easy to imagine a future scenario in which each person gets a wirelessly readable microchip implant for identification. In a live-or-die situation, the autonomous car could read the chip implants of everyone involved, request their reputation scores from the social credit system, and take the turn that results, in sum, in the least loss of 'social value'. The socially 'well behaved' would survive; the 'criminals' would die.
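A minimal sketch of that decision rule might look like the following Python. The chip reader, the score database, and the maneuver list are all hypothetical stand-ins; the point is only how a 'least social value lost' rule would be mechanized.

```python
# Hypothetical sketch: pick the maneuver whose victims have the
# lowest combined reputation score. All functions and data here
# are invented assumptions for illustration.

def lookup_score(person_id: str) -> int:
    """Stand-in for a query against a social credit database."""
    fake_db = {"A": 950, "B": 400, "C": 720}
    return fake_db.get(person_id, 500)  # unknown people get a default

def social_value_lost(person_ids: list[str]) -> int:
    """Sum of reputation scores of everyone a maneuver would kill."""
    return sum(lookup_score(pid) for pid in person_ids)

def choose_maneuver(options: dict[str, list[str]]) -> str:
    """Take the turn that destroys the least total 'social value'."""
    return min(options, key=lambda turn: social_value_lost(options[turn]))

# Example: swerving left kills person B; staying kills A and C.
options = {"swerve_left": ["B"], "stay_course": ["A", "C"]}
print(choose_maneuver(options))  # -> "swerve_left": the low scorer dies
```

Note that the rule itself is a trivial minimization; every moral judgment is hidden inside the scores, which is exactly the concern raised above.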


Would we feel comfortable in such a system? Could we trust it?

