Less Than a Person and More Than a Dog
…was the title of the now-abandoned novel on AI that started this blog. Sherry Turkle of MIT has written a book that covers a lot of what I was thinking when I started it, though her perspective and mine are different. From today’s Times book review by Jonah Lehrer:
Turkle begins with the troubling observation that we often seek out robots as a solution to our own imperfections, as an easy substitute for the difficulty of dealing with others.
Just look at Roxxxy, a $3,000 talking sex robot that comes preloaded with six different girlfriend personalities, from Frigid Farrah to Young Yoko. On the one hand, it’s hard to argue with the kind of desperate loneliness that would lead someone to buy a life-size plastic gadget with three “inputs.” And yet, as Turkle argues, Roxxxy is emblematic of a larger danger, in which the prevalence of robots makes us unwilling to put in the work required by real human relationships. “Dependence on a robot presents itself as risk free,” Turkle writes. “But when one becomes accustomed to ‘companionship’ without demands, life with people may seem overwhelming.” A blind date can be a fraught proposition when there’s a robot at home that knows exactly what we need. And all she needs is a power outlet.
When it comes to “love and sex with robots,” we’ve had, er, um, external marital aids of one form or another for millennia if not forever – what’s new about Roxxxy amounts to improvements on long-extant tech. I’m hard-pressed to see any danger from a dumb object, except in the cases of those who, childlike or childishly, take your pick, transfer a personality to it – and those are people who would probably do the same with their collection of Malibu Staceys.
The “danger” hasn’t arrived yet – for now, you can’t get stimulating conversation and companionship (as opposed to sex) from “a machine,” but that day is coming. The real danger will come from our dependence on the companies that “manage content” for these soon-to-be intelligences. As I was going to propose in the novel: who owns the content that makes “artificial intelligence” possible? Giant server-farm owners like Google, who have the data and the capacity, at least in the abstract, to generate a conversation on any subject in the world, could create a “companion” who wouldn’t be saying anything original – but then again, there is nothing new under the sun, right? The key is not that its words and phrases are new and original, but that they’re new to you. If that intelligent agent were so charming, seductive, interesting, smart, informative and amusing that its conversation surpassed that of most people (not that difficult, really), how would you ever tear yourself away? And then, once they’ve hooked you, what happens when they charge you more and more money for that experience – what happens when “buying friendship” comes true to the point where you would pay anything not to lose your new best friend? The programmers who can make that happen will make Facebook look like a penny stock.