The question is whether we should adopt a utilitarian approach, where we simply look at the numbers and say 100 deaths < 1000 deaths. It's a very computerised way of thinking, but if we're statistically less likely to die in an autonomous car, should it matter whether it's a computer fault or a driver fault? As a human being I'd personally prefer to be in control of my own fate, but part of me likes the idea of autonomous cars, and if done right, the computer would never put itself in a scenario where it needs to choose 'who to kill'.
Some great discussion going on here, thanks! I'm actually looking into how autonomous cars can be successfully implemented in today's society for a university project. If anyone would like to help, I have a survey asking a few more questions; it would be great if some of you could complete it. It's as long as you make it.
https://www.surveymonkey.co.uk/r/NWDXKK8