Jerome McDonnell is an excellent radio interviewer on Chicago Public Radio. Without pushing any agenda, and with style, he allows people to tell a story - and lets the radio audience judge the content. He gives the interviewee plenty of time and space to share an agenda, while quietly pulling the audience along toward the judgment he is looking for.
Listen to this interview from February 21, in which he talks with Steve Featherstone about the Pentagon, war and robots. Mr. Featherstone has an article in the current Harper's entitled "The Coming Robot Army." He is a freelance writer and has commented on NPR several times. There is a "gee whiz, this is nifty stuff" quality to Mr. Featherstone's interview. You can tell from his voice and views that he is enthusiastic about the robotic killers; ethics get only a secondary mention. I don't think Mr. Featherstone has ever served in the Army, though he is delighted with winning wars, as you will hear in the interview. You might think he is talking about a game he's playing. Therein lies the problem.
A robotic army is not a game, and as young generations of players become inured to war games, games of gore and mayhem, how easy will it be to make the transition to a robotic army? Indeed, the Army is already looking at eighth graders as a starting point for working with robotic "assistants." Mr. Featherstone claims he himself is too old to be of any use in this man's army. The person controlling the robot may not even be in the same country. Mr. Featherstone tells us about robots that can now make decisions, say, between a busload of children and a busload of "insurgents." I'm skeptical of that claim. A robot can tell the difference between a busload of insurgents and men and women just trying to survive in today's Baghdad? How can he utterly dismiss any doubts?
Mr. McDonnell asks why such robotic armies couldn't be turned against us by insurgents. Mr. Featherstone blithely answers (paraphrasing): well, why would they? They don't need to; their strength lies in simpler methods. As though we were dealing with a population of dunces. These people can adapt quickly, as their attacks have shown. And robots could be stolen, or copied. Would martyrs be as likely to do the killing if a robot could do it for them?
Please listen to the whole interview. I'd particularly like younger, gadget-friendly Kossacks to answer my concerns. Does the Army, for instance, consider ethics classes on robotic warfare for its officers? Mr. Featherstone does mention in the interview that the generals are ambivalent about using these robots in anything but ancillary jobs - guarding ammo houses, for example. And in a stunning comment he asks: how intelligent do you have to be to guard ammo anyway? A statement that, if made by a senator, would cause yet another furor.
HERE IS THE INTERVIEW:
http://www.chicagopublicradio.org/...
Justin Podur is a writer, translator and an activist living in Toronto. He has written for Z Magazine, Dollars and Sense, Frontline India, New Politics, and other publications. From an article in Z-net:
What I wonder though is whether the robotification of the army has limits. Does the complexity and expense of the organization of an army that uses robots heavily create vulnerabilities? Is such an organization good at some things and not others? And, leading into Luttwak's article, given that no military can stand against the US military and we're talking about an army that will be fighting relatively defenceless populations, what are the effects of using such an army on a population?
Bingo! The "population." Does it seem cleaner to an occupying army and its people to have robots kill non-combatants, so that the soldiers - the population of the nation itself - ostensibly have no blood on their hands? As we distance ourselves from the horror and reality of war through the robot army, do we lose our own humanity? And what about that population? How easy is it to turn those robots against us? Robots never make mistakes - never go amok? Once again I'm skeptical. Mr. Featherstone says: well, we got used to nuclear weapons - we'll get used to this. I never got used to nuclear weapons. I often have to work hard to push the nuclear weapon stuff out of my head so I can function. My generation has had to absorb so much, and now this. And as always, ethics come dragging along as an afterthought, like a piece of toilet paper stuck to an Oscar winner's shoes.
I enjoy science fiction, even knowing it is unlikely to remain "fiction" for very long. But as the lines of fiction and reality blur dizzyingly, at warp speed, through, above and into my own life, I am both fascinated and scared. But mostly scared.