Autonomous cars: the AI will make horrible choices.
November 16, 2019


Fully autonomous cars that handle all aspects of driving are called level 5 (L5 cars for short in this blog).
The logic of basic self-driving is reasonably easy to teach the car. The rules are: follow the road, understand the lane lines and road symbols, know your directions and map, determine what objects are, match and predict object behavior against previously seen object patterns, and avoid hard objects. Essentially, one has to get from point A to point B, follow the road rules, and not crash.
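As a rough sketch of what that rule set looks like in code, here is one tick of a simplified driving loop. Every name and threshold here is invented for illustration; no actual vendor's stack looks like this, but it captures the "follow the road, obey limits, don't hit hard objects" shape of the problem.

```python
from dataclasses import dataclass

@dataclass
class TrackedObject:
    kind: str             # e.g. "car", "pedestrian", "trash_can"
    distance_m: float     # distance from our vehicle, in meters
    closing_speed: float  # m/s; positive means it is approaching us

def plan_step(lane_center_offset_m: float,
              speed_limit: float,
              current_speed: float,
              objects: list[TrackedObject]) -> dict:
    """One tick of the 'get from A to B, follow the rules, don't crash' loop."""
    steer = -0.1 * lane_center_offset_m              # stay centered on the road
    throttle = 0.05 * (speed_limit - current_speed)  # obey the posted limit

    # Avoid hard objects: brake for anything near and closing fast.
    for obj in objects:
        time_to_impact = obj.distance_m / max(obj.closing_speed, 0.01)
        if time_to_impact < 2.0:  # seconds of headroom
            throttle = -1.0       # full brake
            break

    return {"steer": steer, "throttle": throttle}
```

Everything in that loop is a function of measurable inputs, which is exactly why big data and fast sensors have gotten us this far.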
I don't mean to minimize the effort, but these are problems that are solvable with AI systems that deal with vast amounts of data. Much of the advancement has come from the speed of computers and sensors, combined with a few small teams of people and relatively small amounts of money. We will make these systems better and better. Eventually they will be near-flawless, and it will seem crazy that we were ever in control of a 70 mph metal missile.
But more is needed than these rules. The AI will need to make decisions in situations it has not seen before. When sliding on ice, for example, the trash can could be hit, but the baby carriage should be avoided at all costs. Big data has nothing to offer here, and the AI needed for decisions on these border cases won't be solved for a long time. Indeed, asking AI to solve problems that seem clear to us (keep people safe at all costs) may lead to totally wrong results (OK, then I won't start the car).
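Here is a toy illustration of that last failure. If the objective really is "minimize expected harm at all costs," the optimizer has an obvious, useless answer. The harm numbers below are made up purely to show the shape of the problem:

```python
# Hypothetical expected-harm estimates for each available action.
ACTIONS = {
    "stay_parked":      0.0,    # zero harm: the car never moves
    "drive_cautiously": 0.001,  # small residual risk
    "drive_normally":   0.01,
}

def safest_action(expected_harm: dict[str, float]) -> str:
    # "At all costs" means: always pick the minimum, no matter how useless.
    return min(expected_harm, key=expected_harm.get)

print(safest_action(ACTIONS))  # -> "stay_parked": the car refuses to start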
Given any understanding of computer programming, it is clear that some problems cannot be solved with standard functional or procedural logic, nor with the vast amounts of statistics needed for basic driving. Today we don't have a solution to determine "which human to save," or whether crash-landing into a bush is better than hitting a wall.
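The honest version of that routine is easy to declare and impossible to finish. The signature below is hypothetical, but the gap it exposes is real: nothing in functional or procedural logic, and nothing in a training set, tells us what to put in the body for the hard cases.

```python
def choose_collision_target(option_a: str, option_b: str) -> str:
    """Return whichever collision target the car should accept.

    choose_collision_target("bush", "wall") has a physics answer.
    choose_collision_target("passenger", "pedestrian") does not.
    """
    raise NotImplementedError("There is no agreed-upon rule to encode here.")
```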
As a real-world example of this problem, consider the accident where a Tesla ran into a truck crossing the road. Why? Possibly because the AI had no experience with the sideways view of a truck on a highway. It didn't know what to do, and the logic of "don't run into hard stationary objects" got mixed with "objects cross the road," and it "thought" the situation couldn't happen. It may have determined the truck was a sign in the distance.
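To make the failure mode concrete, here is a speculative sketch of how two individually reasonable rules can cancel each other out. This is not Tesla's actual logic; the heuristics are invented to show how a white trailer seen side-on could score as something you drive under:

```python
def classify(width_m: float, height_above_road_m: float) -> str:
    # Hypothetical heuristic: wide, bright objects raised off the
    # road surface are usually overhead highway signs.
    if width_m > 10 and height_above_road_m > 1.0:
        return "overhead_sign"
    return "obstacle"

def should_brake(label: str) -> bool:
    # "Don't run into hard stationary objects" only fires for obstacles.
    return label == "obstacle"

label = classify(width_m=15.0, height_above_road_m=1.2)  # a trailer, side-on
print(label, should_brake(label))  # -> overhead_sign False: no braking
```

Each rule is sensible on its own; the crash lives in the gap between them.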
What choice will the AI make when it must weigh the passengers against a pedestrian on an icy day?
Will it understand that a deer should be hit but not a person bent down picking up a wallet? What about an elderly driver versus a child?