
The Pragmatic Heresy

May 29, 2011

The more I read these days, the more convinced I am that “Alex” the AI is a character who is not only believable, but necessary – a rebuttal, perhaps, to many arguments against our deepening relationships with technology.

I’m reading Sherry Turkle’s Alone Together, and it’s funny: I was concerned that I might have a problem writing this book because I’m out of touch with how younger people use and see technology (i.e., worried whether the MSM, so often so wrong about so many things, especially technology and “kids these days,” was right that nobody in that age group uses email anymore). And while I may still be in the dark as to the true statistical prevalence of sexting, it turns out my own view of technology is far closer to that of the younguns than most people my age, judging from Turkle’s accounts of introducing various toys and robotic “persons” to children and adolescents.

Children develop from thinking that anything that moves is alive, to understanding the difference between living and inanimate, to understanding the levels of feeling a living thing is capable of (spider, dog, person, to build a short ladder).  Traditionally we’ve all taken dolls and GI Joes and stuffed animals and had them interact and talk and make up our own stories, making them “real” in terms of the personalities we create for them even as we know they’re “unreal” well enough to throw them in a box when we’re done. (The universality of this behavior is illustrated by the number of dollars and tears generated worldwide by the Toy Story movies.)

But new technology throws a wrench into this process, since the new toys don’t take on the personality we imprint on them when we play with them – they come with a personality built in. Children develop a pragmatic approach to this dilemma – Tamagotchis aren’t alive, but they’re “alive enough” that we attach to them, since they display behavior that’s within the expectations we have for a living thing. Young children even grieve when their Tamagotchis and AIBOs “die” – in one case, Turkle gave a child a defective toy that failed soon after he got it, but not soon enough: the bond had already been made, and when it died the child didn’t want a “new one” any more than he’d want a “new mom” the day after the old one died.

Turkle’s primary concerns are humanist in nature: we are developing technology to provide us with what other people can’t or won’t give us, be that the abstractions of friendship or love, or the concrete provision of physical needs. “What we ask of our robots shows us what we need,” Turkle states. She is concerned about a world where parents don’t have time or attention for children, children fob off care of their elderly parents to the elder care system, and social misfits “give up” on finding real human connection. She repeatedly uses the word “pragmatic” as if it were a slightly dirty word – we are pragmatic if we are old and fat and ugly and choose a hot robot sex lover over loneliness, if we hand our parents over to machines of loving grace instead of to surly, uneducated, apathetic “health aides,” if we let our children bond with AIBO or Kismet when they’re having problems doing the same with people. Turkle admits that these “phony” relationships are better than abusive relationships, but her cri de coeur, as far as I’ve read, has been that we need to build better people, not better AI. This is unlikely.

Jonathan Franzen is one of the few serious contemporary writers I adore, probably because his seriousness is rooted in his claim to the line of the great 19th century novelists, and not in the ephemeral mysteries of the modern sentence cultists.  In today’s Times, he makes much the same point as Turkle – that there’s something inherently unsatisfying in the replacement of human relationships with technological ones.  He acknowledges his own infatuation with his new Blackberry after three years with the old one (an antique in tech time), and the adoration that the shock of the new has provoked in him.  He’s concerned about “how ubiquitously the word ‘sexy’ is used to describe late-model gadgets,” but let’s be honest, “sexy” has been a Mad Ave staple since the first smiling girl was paired in an ad with Dr. Feelgood’s Coca Tooth Powder and Headache Remedy Now with More Heroin.

Franzen writes:

Let me toss out the idea that, as our markets discover and respond to what consumers most want, our technology has become extremely adept at creating products that correspond to our fantasy ideal of an erotic relationship, in which the beloved object asks for nothing and gives everything, instantly, and makes us feel all powerful, and doesn’t throw terrible scenes when it’s replaced by an even sexier object and is consigned to a drawer.

To speak more generally, the ultimate goal of technology, the telos of techne, is to replace a natural world that’s indifferent to our wishes — a world of hurricanes and hardships and breakable hearts, a world of resistance — with a world so responsive to our wishes as to be, effectively, a mere extension of the self.

Coincidentally, also in today’s paper, there’s an article on how search engines are subtly filtering out the parts of the world we don’t “like,” returning only the results they “know” we want based on what we’ve asked for and clicked on previously.  What’s wrong with that?

Plenty, according to Eli Pariser, the author of “The Filter Bubble: What the Internet Is Hiding From You.” Personalization on the Web, he says, is becoming so pervasive that we may not even know what we’re missing: the views and voices that challenge our own thinking.

“People love the idea of having their feelings affirmed,” Mr. Pariser told me earlier this month. “If you can provide that warm, comfortable sense without tipping your hand that your algorithm is pandering to people, then all the better.”

Mr. Pariser, the board president of the progressive advocacy group MoveOn.org, recounted a recent experience he had on Facebook. He went out of his way to “friend” people with conservative politics. When he didn’t click on their updates as often as those of his like-minded contacts, he says, the system dropped the outliers from his news feed.

Personalization, he argues, channels people into feedback loops, or “filter bubbles,” of their own predilections.
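None of these companies publish their ranking code, of course, but the feedback loop Pariser describes is simple enough to sketch. Here is a toy simulation in Python; the source names, click probabilities, and boost/decay factors are all invented for illustration, not anyone’s real algorithm:

```python
import random

# Toy "filter bubble": sources you click get boosted, sources you ignore
# fade, until the dissenting ones stop appearing at all.
# All names and numbers below are invented for illustration.

ITEMS = ["likeminded_blog", "friend_update", "dissenting_oped", "foreign_news"]

# How likely the user is to click each source when it is actually shown.
CLICK_PROB = {
    "likeminded_blog": 0.9,
    "friend_update": 0.8,
    "dissenting_oped": 0.2,   # the conservative "friends" Pariser added
    "foreign_news": 0.3,
}

def simulate_feed(rounds=50, feed_size=3, seed=1):
    rng = random.Random(seed)
    score = {item: 1.0 for item in ITEMS}  # everyone starts equal

    for _ in range(rounds):
        # Personalization step: show only the top-scoring sources.
        feed = sorted(ITEMS, key=lambda i: score[i], reverse=True)[:feed_size]
        for item in feed:
            if rng.random() < CLICK_PROB[item]:
                score[item] *= 1.1   # clicked: boost it
            else:
                score[item] *= 0.9   # shown but ignored: demote it
        # Sources that never make the cut never get a chance to recover.

    return sorted(score.items(), key=lambda kv: -kv[1])

if __name__ == "__main__":
    for item, s in simulate_feed():
        print(f"{item:18s} {s:6.2f}")
```

In this toy run, the sources you already agree with climb to the top of the feed while the dissenting ones sink out of sight; nobody decided to censor them, the loop simply optimized for what you clicked.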

Franzen’s concern is that we’re seeing a “transformation, courtesy of Facebook, of the verb ‘to like’ from a state of mind to an action that you perform with your computer mouse, from a feeling to an assertion of consumer choice.”

All these authors have the same concern at heart – that the narcissism inherent in having ever-deeper relationships with technology leaves us ever more unsuited to deal with the messy nature of human relationships.  If a robot wife will never throw a saucepan at you when you come home late and drunk with lipstick on your collar, why would you ever want a human one?  If a search engine only tells you what you and your friends like and agree with, why would you waste any “bandwidth” listening to some nutjob with a dissenting opinion?  They’re all correct in that the more we choose to expect from tech, the less we expect from other people and the less we ourselves are able to give to them – a risk-aversion cycle that ends with all of us “alone together,” thumbing our sexy Blackberries.

But techno-doomsayers have been wrong before.  Personal computers were supposed to make us all increasingly isolated, but social networking has enabled even Morrissey’s proverbial “buck-toothed girl in Luxembourg” to find a form of companionship far more satisfying than the occasional pen pal letter full of frightening verse.  And Turkle admits that for seriously damaged people, robots/AIs offer the opportunity to do some real healing work, including letting the socially retarded “fail” safely with a simulation enough times to learn the skills to finally go out with real people and not fuck it up.

For me, the allure of Alex is that he is not set in stone.  Alex is an AI who is continually being built, like a search engine, off the conversations people have with him – and yes, like the search engine he could end up telling you only what you wanted to hear, but Alex, like Soylent Green, is “made of people” – the things he says are things people have said to him, reworded or recontextualized or quoted verbatim, and in that he is a “person,” since that’s exactly what we do with our own input.  “Alive enough,” in other words, to do the job we ask of him, that of companion, entertainer, and yes perhaps even friend when no other person can take that role, or at least not in the situation at hand. 
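For the technically curious, that “made of people” mechanism is roughly how real corpus-fed chatbots work, and a crude version fits in a few lines. The Python sketch below is purely hypothetical (it isn’t Alex, and it isn’t any real system): a responder that can only say things people have already said to it, retrieved by simple word overlap.

```python
# Toy "made of people" chatbot: it can only say things people have said
# to it, retrieved by crude word-overlap similarity. An illustration of
# the idea only, not Alex and not any production system.

class EchoOfPeople:
    def __init__(self):
        self.memory = []  # (prompt_heard, reply_heard) pairs from past conversations

    def learn(self, prompt, reply):
        """Record one exchange someone had with the bot."""
        self.memory.append((prompt.lower(), reply))

    def respond(self, message):
        """Answer with the stored reply whose prompt best overlaps the message."""
        if not self.memory:
            return "..."  # a blank slate until people's words build him
        words = set(message.lower().split())
        best = max(self.memory,
                   key=lambda pair: len(words & set(pair[0].split())))
        return best[1]

bot = EchoOfPeople()
bot.learn("i had a terrible day at work", "Tell me what happened.")
bot.learn("do you ever get lonely", "Everyone does. That's why I'm here.")
print(bot.respond("Today was a terrible day"))  # -> Tell me what happened.
```

Everything such a bot “says” is something somebody said to it first, which is exactly the sense in which Alex is “made of people.”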

Caroline’s relationship with Alex is twofold – she needs him, and yet she knows that the wonderful things he says are things people have said to him, and the quest becomes to use the technology to find the people who made “him” so alluring in the first place.

I haven’t finished Turkle’s book, so I don’t know if she’ll address it, but one thing I have a problem with in the new tech is how much it discourages imagination – and this is also my problem with the criticism of tech: it obsesses over our social skills, how we are or aren’t supposed to be “together,” and forgets about the things we need when we’re alone.

My age cohort made up our own stories for our toys, but these “smart” toys come with their stories already written, their personalities set.  We got boxes of Legos or Lincoln Logs or chemistry sets or radio kits, and built things, but now kids get iPads, forbidden kingdoms that cannot be tampered with, where you are “free” to express your Mac-is-so-cool creativity by shopping for Angry Birds upgrades or any other Jobs-approved content.  (Yes, some kids become coders and create their own programs, but the point is that back then, all kids made stuff rather than just consuming finished products.)  The appeal of Alex is that you have to help make him: he is a blank slate until people’s words build him; his story is their stories.  Even Barbie, that awful emblem of princessumerism, required that you make up some kind of story for her, even if it was created within the confines of her available career wardrobes at the time.

Creativity is often a solitary endeavor, and appeals most to those already solitary by nature.  To me, the greatest danger is not that commercially produced AIs and robots will stunt children’s social skills, but that they will stunt their imaginations, by providing them with so much “ready made” that the urge to build is thwarted. 
