Truing up the philosophical conversations between Nick and Caroline around Alex, and otherwise polishing up the story. One more chapter to go there and then copyedit this weekend, then…publication. It’s a Dickensian pace I’m setting myself.
Jaron Lanier may be the other person I can send the book to. He had an article in the Times about the digital economy, in line with what I’m saying about Alex. He talks about “Siren Servers,” the supermassive databases used by finance and insurance companies among others to hoover up data about you that you often give willingly:
A hip trope holds that privacy is passé, but the loss of one’s privacy to a Siren Server means more than the loss of one’s credit card or Social Security number to a petty online thief. An ordinary person’s choices in music, friends, purchases, reading material and travels in the course of the day are just some of the streams of data that feed into algorithms that compare and correlate the activities of everyone being spied upon.
The motivation for the omni-ogling is that it leads to effective behavioral models of people. These models are far from perfect, but are good enough to predict and manipulate people gradually, over time, shaping tastes and consumption in more effective and insidious ways than even subliminal advertisements do.
Manipulation might take the form of paid links appearing in free online services, an automatically personalized pitch for a candidate in an election or perfectly targeted offers of credit. While people are rarely forced to accept the influence of Siren Servers in any particular case, on a broad statistical basis it becomes impossible for a population to do anything but acquiesce over time. This is why companies like Google are so valuable. While no particular Google ad is guaranteed to work, the overall Google ad scheme by definition must work, because of the laws of statistics. Superior computation lets a Siren Server enjoy the magical benefits of reliably manipulating others even though no hand is forced.
Yep, that’s Alex.
Also in Sunday’s paper, an article by Jonathan Safran Foer, who decries how technology drives us apart. Irony alert: He starts the article with a story of a girl he sees crying on the sidewalk.
A couple of weeks ago, I saw a stranger crying in public. I was in Brooklyn’s Fort Greene neighborhood, waiting to meet a friend for breakfast. I arrived at the restaurant a few minutes early and was sitting on the bench outside, scrolling through my contact list. A girl, maybe 15 years old, was sitting on the bench opposite me, crying into her phone. I heard her say, “I know, I know, I know” over and over.
What did she know? Had she done something wrong? Was she being comforted? And then she said, “Mama, I know,” and the tears came harder.
What was her mother telling her? Never to stay out all night again? That everybody fails? Is it possible that no one was on the other end of the call, and that the girl was merely rehearsing a difficult conversation?
“Mama, I know,” she said, and hung up, placing her phone on her lap.
I was faced with a choice: I could interject myself into her life, or I could respect the boundaries between us. Intervening might make her feel worse, or be inappropriate. But then, it might ease her pain, or be helpful in some straightforward logistical way. An affluent neighborhood at the beginning of the day is not the same as a dangerous one as night is falling. And I was me, and not someone else. There was a lot of human computing to be done.
Then he goes on about how technology makes it easier to ignore the needs of others…and never says whether he talked to her or not. So I guess the answer is that he didn’t, preferring instead to go home and write this article about how he could have if technology hadn’t made us less empathetic.