- Six months ago, I wrote that “the REAL name of self-driving cars must become something like SOMT: Shared, On-Demand, Micro… TRAIN”. Today, I realized I should better explain part of that concept, because I received the following sensible critique on Twitter: “Well, by definition a train runs in predetermined courses. I find it hard to imagine how it could work in practice to have a transport that is both separated by pedestrians etc and does not run in predetermined courses”.
Uber believes that Self-Driving Trucks will result in MORE jobs for truck drivers, not fewer. Why, and what does this REALLY mean?
According to many people, a huge, if not THE main, long-term problem with self-driving cars is how to write software that concretely “helps those cars make split-second decisions that raise real ethical questions”. The most common example is the self-driving car variant of the Trolley Problem: “your self-driving car realizes that it can either divert itself in a way that will kill you and save, say, a busload of children; or it can plow on and save you, but the kids all die. What should it be programmed to do?”
In my opinion, this way of looking at the problem is greatly misleading. First of all…