
Wired for War

March 30, 2009

Ahh, a nice restful weekend.  Finished both P.W. Singer’s Wired for War and Irene Pepperberg’s Alex and Me, and of course Alex will get a full post later this week.  Wired is definitely worth a review, so here it is.  It’s a massive catalog of every current and many future battlebot technologies, from Predators to IED killers, and offers a great explication of how the “Rummy Doctrine” of minimal manpower and maximum technology was formulated and implemented in Iraq.

The key to what was called “network-centric warfare” was the shift to the new information technologies of computers, the Internet, fiber optics, and so forth, which allowed an enhanced level of communication and information sharing…this would infinitely speed up the pace of operations, they argued.  Soldiers and generals a continent away would look at the same image online, at the same time…they could operate with a speed and cohesion that would “dramatically increase force efficiency”…With the “fog of war” lifted, and the “system of systems” working to perfection, fewer forces could be sent into battle and they could be lighter, quicker and more decisive…“Today,” Rumsfeld stated, “speed and agility and precision can take the place of mass.”

The problems were, first, that while

The U.S. forces were all networked together, with a “blue force tracker” letting them know the position of all the friendly units…the only problem is that they still didn’t know who the enemy was or when or where he was coming…as a Marine joked, “When do we get red force trackers?”

And second,

Rather than a seamless flow of information, soldiers wrestled with everything from Web browsers constantly crashing due to desert sand to heat fouling up equipment designed for use in offices, not battlefields…one Army lieutenant resorts to navigating a convoy using an improvised GPS and some handheld walkie-talkie radios that he had bought from a hardware store back home.

Rummy’s logistical idiocy wasn’t limited to providing inadequate body armor, either:

The most widely used power source in the military is the BA 5590, a standard twelve-volt battery that powers everything from radios to antitank missiles…the demand for the batteries turned out to be much higher than ever planned (the marines alone were using up 3,028 of them a day).  But there were no stockpiles…the only reason the plug wasn’t literally pulled on the Iraq invasion is that thirty other nations loaned the United States extra batteries.  Ironically, many of these nations were the very same ones from “old Europe” that politicians like Rumsfeld had lambasted during the “freedom fries” period…

What was most interesting to me were the psychological issues Singer explores about man and technology.  There’s a great deal about “the loop,” which is the fluctuating amount of authority the human has once the machine has been set in motion. In a rocket defense system, for example, you want the machine to react without waiting for human input, as seconds count when those rockets are screaming towards human targets.  Whereas in a bombing system, you want the human to verify that the target isn’t a wedding party or a Canadian platoon.  Singer offers up horror stories of technology gone Terminator on us, including a computer-controlled antiaircraft gun in South Africa that went berserker and killed nine and injured fourteen, and a naval radar system that mistakenly targeted and shot down an Iranian civilian airliner in 1988.  People were in the loop in the latter case, and yet:

[T]he jet was on a consistent course and speed and was broadcasting a radar and radio signal that showed it to be civilian…the computer system registered the passenger plane with an icon on the computer screen that made it seem to be an Iranian F-14 fighter…even though the hard data was telling the crew that the plane wasn’t a fighter jet, they trusted what the computer was telling them more.

So in this case, the “human error” was accepting the computer as infallible.

Singer says that many of the problems we’ll face in the near future with our automated armies come from a lack of “doctrine.”  The British, for instance, introduced tanks into World War I, but after they broke through German trench lines, there was no plan for what to do with them next, and the offensive stalled.  In our current state, the military is spending billions on new technology and throwing it into the field willy-nilly, with no master plan for how to coordinate the tech being “implemented.” The mandate is to buy the latest and greatest, but there’s little discussion within the armed forces of anything from support logistics to the ethics of remote killing.

There’s also a clear opening for massive failure in robowarring, especially in insurgent situations.  A six-year-old with a can of spray paint can disable a multimillion-dollar piece of equipment; a Taliban fighter with engineering skills (and engineers, alas, seem to have a particular vulnerability to extremism) could reprogram a captured robot to kill American troops, or to kill civilians in a way for which America could be blamed.  Moreover, Singer reminds us, in many of the furious moustache nations in which we’re fighting for hearts and minds, there’s a culture of honor that sees a man who sends a machine into battle in his place as “cowardly,” handing the psychological-ops advantage to the insurgents, who can look down on an enemy who won’t “come out and fight” and risk his own blood.  And the terrorist mentality, as we’ve seen from the execution porn they are so fond of making, needs to kill the enemy on a regular basis to sate its bloodlust; if there are no live bodies on the battlefield to kill, Singer suggests, they may redouble their efforts to attack civilians here and abroad.

The “shock and awe” that a platoon of unstoppable robots could deliver to the enemy is indeed overwhelming, but as Singer notes, the longbow, the catapult, the cannon and the rifle were in their own time shockingly effective at first (and decried for their “dishonorable” effect of safeguarding their operator from the “300” way of battle), but eventually the enemy adjusted, quickly procuring the same technology for themselves.  The irony is that as the commercial robotics world works to make robots acceptable, with human or animal forms and characteristics, the military robotics world is trying to make them more frightening, more monstrous or insectile.

Singer and many whom he interviews are worried that “remote control” war, in which an operator in Nevada kills insurgents all day and then drives home and goes to a PTA meeting, will dehumanize both soldiers and the civilian populace that consents to war, making war more likely when casualties are guaranteed to be minimal.  It also makes for a disconnect between the boots on the ground and the eyes in the sky when one of them is safe at home and the other is in the line of fire.  Think of the scenes in Body of Lies where Leo DiCaprio is in the shit in the scariest parts of the Middle East while Russell Crowe, as his handler, gives him instructions over the phone from his lawn over morning coffee, from his kid’s soccer game, from all manner of disconcertingly normal situations that make it impossible for him to see or empathize with what’s happening on the ground.  Singer also posits scenarios in which literal armchair generals, and colonels, and admirals, all have access to live feeds from the sky of real-time combat, and therefore all have the opportunity to issue a stream of orders which may be terribly misguided, reflecting as they do the limited intelligence they actually have from that vantage point.

Most germane to my project were the sections on robots in Japan, and on the attachments soldiers form with their robots.  As he discussed in his Jon Stewart appearance, the Japanese have a dramatically different approach to robots than we do.  Whereas in the west we see them as Terminators,

To this day in most Asian science fiction, especially in the anime genre, the robot is usually the hero who battles evil. This has heavily influenced both Japanese scientists and that nation’s culture…Japan’s traditional religion of Shintoism holds that both animate and inanimate objects, from rocks to trees to robots, have a spirit or soul just like a person.  Thus, to endow a robot with a soul is not an illogical leap in either fiction or reality…Buddhism also makes for a more soulful approach to what a westerner would see as just a tool or maybe a mechanical servant.  [One Buddhist author] argues that robots can have a Buddha-like nature and that humans should relate to them as they would to a person.  “If you make something, your heart will go into the thing you are making.  So, a robot is an external self.  If a robot is an external self, a robot is your child.”

Pretty much the argument I plan to make on why Caroline attaches to Alex the way she does.

The whole section on soldiers’ attachment to the robots who save their lives every day in Iraq by destroying IEDs is worth reading.  “Scooby,” a robot who’d hunted down and defused 18 IEDs, had finally met his match.  The soldier who brought him in to the robot hospital for repairs was in tears.  “Can you fix it?” he asked, and when told the answer was no, “the news left the soldier ‘very upset.’  He didn’t want a new robot, but ‘wanted Scooby-Doo back.’”

An affinity for a robot often begins when the person working with it notices some sort of “quirk,” something about the way it moves, a person or animal it looks like, whatever…Pretty soon, it feels natural for the person to give the robot a name, just like they would for another living thing, but not what they would do for most machines…iRobot has found that 60 percent of Roomba owners have given names to their robot vacuums…like the Roomba owners, the soldiers know their robots are not alive…and yet these soldiers are experiencing some of the most searing and emotionally stressful events possible, with something they would prefer not to see as just an inanimate object…To view the robot that fought with them, and even saved their lives, as just a “thing” is almost an insult to their own experience.

There’s a lot more to this highly recommended book, including speculations on the effect “wired war” has on the troops who fight it, the resistance from many robotics scientists to the weaponization of their work, and potential liabilities under international law (i.e. just who is violating the Geneva Convention when a robot kills a village – the programmer, the remote operator, the unit commander, the spotter?).  Given Singer’s previous Defense Department work, with any luck his book is being studied there now.
