Moral decisions
"Now the car has to decide whether to run into this truck and kill me, the driver, or to go up on the sidewalk and kill 15 pedestrians"
A nice easy boolean flag for moral decisions, definable per "driver": SaveDriversLifeHasTopPriority
"Driver" of the "needs of the many outweigh those of the few" ethos will set the flag to False
"Driver" of the "I'm the most important thing in the universe" mindset will set the flag to True
The code could then make "moral decisions" based on the flag.
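A minimal sketch of what that flag-driven decision might look like. Every name here (the flag constant, choose_action, the action strings) is invented purely for illustration; no real autonomous-driving system exposes anything like this.

```python
# Hypothetical illustration of the SaveDriversLifeHasTopPriority flag idea.
# All identifiers are made up for this sketch.

SAVE_DRIVERS_LIFE_HAS_TOP_PRIORITY = False  # the "needs of the many" driver's setting


def choose_action(driver_fatalities: int, pedestrian_fatalities: int) -> str:
    """Pick between hitting the truck (killing the driver) or
    swerving onto the sidewalk (killing pedestrians)."""
    if SAVE_DRIVERS_LIFE_HAS_TOP_PRIORITY:
        # Selfish setting: always sacrifice the pedestrians.
        return "swerve_onto_sidewalk"
    # Utilitarian setting: minimise total fatalities.
    if pedestrian_fatalities > driver_fatalities:
        return "hit_truck"
    return "swerve_onto_sidewalk"


# The scenario from the quote: 1 driver vs 15 pedestrians.
print(choose_action(driver_fatalities=1, pedestrian_fatalities=15))  # -> hit_truck
```

Of course, the whole point of the objection that follows is that real driving software wouldn't reduce to a single tidy if-statement like this.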
Though, in all seriousness, for something as complex as driving you'd expect some AI-style code to be present, and as such it's often quite hard to know why the AI software "makes a given choice".