The Disorderly March XV, or, Maslow’s Hammer
Yesterday’s Times biz section had an article on why Google has failed at its attempt to “reinvent” philanthropy, described by one professor as
“…a Googley idea that DotOrg would completely reinvent philanthropy and, in doing so, reinvent the world and address a hugely important set of problems with solutions only Google with its immense intellectual talent and resources could find.”
The problem was that Googlers weren’t interested in “soft” solutions to “soft” problems.
“We concentrated on complicated engineering problems rather than large development challenges,” said a former executive of DotOrg…“That meant we were creating solutions that were looking for problems rather than the other way around.”
Those solutions also had to be something that Google engineers, who represent the cream of the world’s elite universities, believed that only they could create.
For instance, in early 2008, some DotOrg staff members with traditional nonprofit backgrounds proposed a system to track drugs for diseases like malaria and tuberculosis through the supply chain, in order to combat drug counterfeiting and theft…The team’s idea was to engineer a FedEx-type system, relying in part on text-messaging, that would track drugs from the moment they left a manufacturer’s control until they reached a patient.
The plan never went anywhere, however, because text-messaging was not sophisticated enough to challenge Google’s engineers, several former DotOrg executives said. The culture clash between the engineers — caustically referred to by former DotOrg executives as “the Brahmin” — and those from development organizations was exacerbated by DotOrg’s leader, Dr. Brilliant, according to a dozen former employees of DotOrg.
This culture clash was fatal to the effort.
When the Google founders did attend a meeting about DotOrg, they spent most of their time fiddling with their BlackBerrys. At one meeting, former DotOrg executives said, they were stunned when Mr. Brin dropped to the floor and started doing push-ups…
Engineers from outside DotOrg were assigned to review all of the grants it had made. Not surprisingly, they wanted to know why there wasn’t more engineering in, say, a grant made to assess the quality of basic education in Tanzania’s schools. They also wanted to know why DotOrg wasn’t working more to “scale” up small projects to have a broader impact.
“They never understood that technology is a means to an end, and that in the developing world, sometimes basic technology, like the collection and compilation of data, can have enormous impact,” said another DotOrg program officer, who resigned after the reorganization…
“I believe DotOrg was under pressure to come up with what was called game-changing strategies,” said Professor Simon, who also serves as director of Brandeis’s sustainable international development programs. “They were looking for something like a new algorithm — but there isn’t any algorithm that’s going to eradicate guinea worm.”
This is the “fatal error” in the engineering mindset, so well put by Abraham Maslow:
“It is tempting, if the only tool you have is a hammer, to treat everything as if it were a nail.”
Algorithms are “clean,” “beautiful,” and autistic. Google came into philanthropy with open contempt for “the process,” but sometimes a process is also a culture, and a culture is people. You don’t change a culture by openly despising its people, especially people who have given their lives to something far less lucrative than their Orderly March had qualified them for. You don’t change it by openly disrespecting them, dropping to the floor to do pushups in a meeting, or treating them like idiots because they don’t have “mathy” solutions to problems such as how the legacy of theocratic-capitalist colonialism might make some third-world people distrust the “we’re here to save you” messages of foreign aid agencies, how primitive superstitions have to be accommodated and worked within to effect behavioral change, how tribal loyalties perpetuate genocides, or any of the brazillion other problems that can’t be easily quantified and that lead to so much of the suffering in the world.
This is why Google failed at social networking, and I believe it’s why, despite its huge deposits of mineable data, it may not be able to create an AI that will be accepted by Non-Engineering Human Types (“Non-Googly”). It will create something that knows everything and can find anything, but can’t carry on a conversation at a human’s own level, can’t connect on the level that would get a non-engineer to actually trust that “entity” with her personal data, even her personal life, because these things aren’t “important.” They are engineerable, if you have a sense of what human beings hope for and fear in the future of computing, what makes them trust each other and their machines, what can get them to cross the “uncanny valley” between man and man-like machine. But if you despise all that, and wave the banner that says “There is no God but the Algorithm, and Google is its Prophet,” you’re going to create something that’s absolutely perfect…for engineers, and then scratch your head and wonder why everybody else doesn’t want to buy the most beautiful hammer in the world.