Auto Body Language
I got an unaccountably large (for me) number of hits yesterday on the last part of my review of Sennett’s The Craftsman from 2009, which got me thinking about the whole “man vs. machine” thing again.
Looking back, I realize that the “Alex” novel (the AI novel idea that started this blog) was never going to fly – there was just no way to make a novel work when it starts with a woman talking to a computer for a hundred pages before any other characters really show up. On stage, I could see it, because it would be about the acting, the tone of the disembodied voice of Alex and the reactions of the lone woman onstage (at least for the first act; her quest to find the people who “made” Alex who he was would bring in more characters in act two). It would be a lot more fun for me as a play, too – sitting in a room alone writing about someone sitting in a room alone was just too damn depressing, whereas in the production of a play you get to interact with real humans.
But the reading and research I did was worthwhile anyway, as it changed my ideas about AI, and technology as a whole. I ended up coining (warning: delusions of grandeur) “Outland’s Fallacy.” The concept of Evil Robotic Overlords, the idea that AI will “take over and rule us all,” stems from its believers’ primeval certainty that Intelligence + Opportunity to Gain Power = Guarantee Power Will Be Taken. It assigns to a computer the basest desires of Man, the “Will to Power,” as if the evolutionary struggle for dominance were hard-coded into anything with intelligence. It presumes a need for power that any “man-like object” would inevitably have to have. This atavistic fear of a powerful “other” goes back to our own subhuman days, when anything that was bigger than you and smarter than you and faster than you probably saw you as lunch. But computers don’t need to eat us to survive, or kill us to take our watering hole. What need would a computer satisfy by ruling us or killing us? What makes us think a non-biological entity would even have a survival instinct à la HAL? (P. W. Singer, author of the excellent Wired For War, illustrated this “Robophobia” fallacy pretty well here.)
In its most anti-scientific state it leans on the “Mad Scientist” trope – the brilliant cold mind without a “soul” who wants to take over the world, a Blitzkrieg of Grinchitude, the anti-technological message of the film Metropolis (at least in its edited version), so ably gutted by H. G. Wells.
In the end, this belief says more about the worldviews and desires of its proponents than it does about technology. “Everybody Wants to Rule The World,” of course, so why wouldn’t a computer? Motive being a given in this worldview, it needs only opportunity to make it so.
The whole “man vs. machine” debate is a moot philosophical point anyway. Man makes machines, and while some men use machines to rule others, it is still a man behind the machine. A hand on a joystick guides the Predator drone, a programmer writes the software that runs it, an officer gives the command to fire. To the people on the ground, it may be “indistinguishable from magic,” but the machine is only an extension of Man – it does what it’s told.
Kurzweil’s Singularity, the merging of man and machine (often called the Rapture of the Nerds, but more likely to be the Rapture of the Hedge Fund Managers given the costs involved), is dreaded as some kind of obliteration of our “humanity,” but we have been merging with machines for a hundred years. Most of us do it every day, in fact, when we get behind the wheel of our cars.
You see people using cars as extensions of themselves, as expressions of themselves, every day. Not in terms of “I Am Rich” or “I Am Crunchy,” but in the way they drive, in the way they express themselves through their driving. The little guy in the big truck who races to the stop sign, braking only at the last second, wants you to think maybe he won’t stop after all; the woman who goes 50 on the highway until you try to pass her, then speeds up; the jerk who absolutely has to pass you to get into the exit lane, even though there’s nobody behind you for ten car lengths; the kid who makes a rolling stop at the intersection, kind of stopping but not completely as you try to cross the crosswalk, not entirely sure he will really truly stop in time, carefully watching not him but the rims of his wheels as they do or don’t stop…they are all using their cars to make a statement, gestures and actions primitive and mute (with the exception of the occasional shriek of a horn) but powerful and frightening nonetheless. The machine is an extension of themselves; they wear it and become one with it and use it to express their will to power or hesitancy or stupidity (people with deeply tinted windows who stop at the intersection when it’s their turn, possibly waving you through and baffled as to why you don’t see them waving).
Our machines are only as “evil” as we are: they don’t run people over on their own, they don’t fire themselves, they don’t push the button or pull the trigger. Technology in and of itself is atheistic and amoral; the danger is in the intent and the actions of the people who use it.