I'm still waiting for the flying cars in every household we were promised in the 1950s. Also completely within the realm of the technically possible, yet...?
Whimsical aphorisms like Dylan's "The times they are a-changin' " aren't really substantial enough to defend autonomous airliners. Sorry.
What's ironic is that I WISH fully autonomous flight were possible (in the sense of technically, economically, politically, and psychologically viable today, in 2015, not at some vague time in the future) - yet some of us are forced to try to explain to the self-styled "futurists" why it's not quite as close as they assume.
-Insisting, without having done any market research, that we "must admit there is probably market demand for such a thing, right?"
-Pointing to self-parking cars, airport trams, and Space Shuttle missions, and drawing direct comparisons to the commercial passenger flight infrastructure that moves millions every day.
-Using fly-by-wire as an example of our trust in computers, when it's nothing of the sort. FBW is only an interface between the human operator and the machine, whether the pilot is flying manually or through autopilot selections. It possesses no "intelligence" per se. We depend on its computers to correctly send signals based on what it is directed to do; it does not make decisions in any meaningful way, and we certainly don't "depend" on it to act autonomously or without error or failure.
-Equating technology whose failure causes inconvenience, with technology whose failure costs lives, and giving both equal weight.
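To make the fly-by-wire point concrete: here's a purely illustrative Python sketch (invented names and numbers, nothing like real avionics code). A FBW layer deterministically maps pilot input to a surface command under a fixed control law; it never decides what the pilot should want.

```python
# Illustrative only -- invented function and gains, not real avionics software.

def fbw_translate(stick_input_deg: float, airspeed_kts: float) -> float:
    """A fly-by-wire layer: deterministically maps pilot stick input to an
    elevator command, reducing gain at higher airspeed (gain scheduling).
    There is no "intent" here -- the pilot (or the autopilot the pilot
    configured) supplies all of the decision-making."""
    gain = 1.0 if airspeed_kts < 250 else 250.0 / airspeed_kts
    return stick_input_deg * gain

# Same input, different airspeed -> different command, but always the
# mechanical consequence of what the human asked for:
print(round(fbw_translate(5.0, 200.0), 2))  # 5.0
print(round(fbw_translate(5.0, 300.0), 2))  # 4.17
```

An "autonomous" system, by contrast, would have to generate `stick_input_deg` itself, which is exactly the decision-making capability FBW does not have.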
And on and on with arguments that not only don't make logical sense, but can't really be refuted because they're based more on gut feelings formed from randomly collected perceptions. "Predator Drones! Mars Missions! OnStar! iPhones! We MUST be close to pilotless airliners! In 20 years! And if you ask me how I got 20 years, not 12, or 23, I'll just pull another answer out of my ass!"
Nat Geo recently had an article called "The War on Science" that articulates this growing phenomenon of basing one's beliefs on feelings rather than rational logic. It's kinda like we're heading into an "anti-Enlightenment" period, where either we reject science completely (anti-vaxxers, climate change deniers) or embrace it unquestioningly without acknowledging its limits. (*ahem*) Since I don't believe in endlessly fighting battles that aren't winnable, I'll bow out and let Rockie keep slugging away.
But I'll leave this thread with one last thought. It isn't about designing a computer or system of computers that's "reliable" enough. The failure rate of modern airliners is already astoundingly low, well below the threshold required to gain public trust. That isn't the issue. But even the most sophisticated aircraft flying are still primitive in the sense that they lack the ability for critical thought.
Currently, automated systems on commercial aircraft can't "think". Barring mechanical failure, they do exactly what they are programmed to do. I always laugh when people cry out for more automation after an accident that is blamed on "pilot error". Inevitably the accident was caused by the pilot ceasing to think critically: i.e., why did the Asiana pilot not question why so much control back pressure was required to hold the approach path into SFO as the airspeed decayed, with disastrous results? In other words, the accident could have been avoided if he had acted LESS like a machine, mindlessly holding a target, and more like a human, exercising the capacity for critical suspicion.
So far, that ability, at its most sophisticated, is uniquely human. Closely mimicking it with brute processing power may work for non-life-critical tasks, but it doesn't cut it for autonomous flight. If that changes one day in the future, the implications of having cognizant, self-aware machines are far more far-reaching than just aviation.