January 17, 2016
By now it is no secret: within the next few years, traditional cars will start to be challenged by their robotic brethren, autonomous autos. Some of those on the production side of the equation, Elon Musk among them, say that the introduction of all-out self-driving cars will make not only traditional cars but traditional drivers obsolete. To be fair, the Autopilot feature on the Tesla Model S and Model X already drives better than the average octogenarian (and probably most teenagers too). Musk claims that autonomous autos drive far better than people; that is why he claims that physically driving a car will likely be made illegal in the future.

The problem is, people are erratic. The smallest annoyance can turn Dr. Jekyll into Mr. Hyde. I have seen people turn livid when a stranger does not use a signal before changing lanes… even when the lane entered was not their own to begin with (not that anyone could actually possess public property); by that I mean the miscreant went to the far left lane while the complainant was in the far right. I have also seen that very same protestor not use their blinker, and actually cut someone off. The point is, people can also be hypocritical. That is the very essence of Mr. Musk's assertion: people will do things that they know they should not. Just as diabetics eat sugar, drivers run red lights and stop signs, turn without signaling, tailgate, text and drive drunk, and speed, all while driving a car with a headlight out and an expired inspection sticker. This is because people have been given the gift of rationalization. Those guys over there doing (X) are monsters, but I know I am not evil, so my doing (X) now is not really deplorable; I am just in a hurry, otherwise I would never ever do (X). The next day: I did (X) because I was in a hurry yesterday and it did not hurt anyone, so it is okay for me to do (X) today. Autonomous autos will be so prevalent in the future precisely because they were spared that gift.
The only thing dictating what an autonomous auto does is logic; there is no "why." (For now.) Self-driving cars are given super-specific, non-negotiable boolean demands. For example, in a given situation (if X) the car will do (Y and only Y): if the light turns yellow, the car will slow to a stop, and in no scenario will the car speed up to catch the light. Eventually, boolean commands will be supplemented by deep learning, like the hackermobile: Comma.ai (George Hotz) made a car that learns how its driver and other cars on the road behave, and then expertly mimics them. This technique has two flaws. One: it does not factor in the quality of the teacher (terrible teacher, poor pupil). Two: in the case of a unique emergency, the car would have no idea how to respond. However, that is exactly how we teach our human drivers today, and not a soul has a problem with that. If it is acceptable for people (who are less adept to begin with) to learn that way, I see no problem with letting autonomous autos do the same. In fact, it would create an interesting dynamic (one that Google is pushing for) in which the teacher bears the onus for the driver (the self-driving car). Granted, people are more nuanced than machines, so teaching them a certain way may not be exactly equivalent (people choose to ignore sound advice all the time), but it is still a fair comparison.
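To make the "if X, then Y and only Y" idea concrete, here is a minimal sketch in Python of what a non-negotiable boolean rule might look like. This is a hypothetical illustration of the concept, not anyone's actual control code; the function name and actions are invented for the example.

```python
# Hypothetical rule-based controller: every light state maps to exactly
# one permitted action. There is no branch for "speed up to catch the
# light" -- the rationalization a human driver might reach for.

def respond_to_light(light_color: str) -> str:
    """Return the single permitted action for a traffic light state."""
    if light_color == "green":
        return "proceed"
    if light_color == "yellow":
        # A human might gun it; the rule-based car has only one option.
        return "slow_to_stop"
    if light_color == "red":
        return "stop"
    # Unrecognized input: fail safe rather than improvise.
    return "stop"
```

The point of the sketch is that the yellow-light case has exactly one outcome. A learned, Comma.ai-style system would instead infer its response from observed driving, which is where the teacher-quality and unique-emergency flaws come in.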
Irrational actions cause accidents. If everyone followed the rules exactly at all times, there would never be any traffic (on the highway, at least; stop signs and street lights, especially those that go out, will always create a chokepoint), nor would there be any accidents. Why would anyone want to avoid a future like that? Autonomous autos will shrink the world in as close to a literal sense as possible.