Pros and Cons of Robotic Warfare
As part of our monthly series on ethics and technology, we'll look at whether robotic warfare makes the world a better place.

The next Ethics Forum: "RoboWarfare. Is the World a Better Place When Robots Fight Our Wars for Us?" is Wednesday, September 2, at 5:30 p.m. at the Reuben H. Fleet Science Center. The event is free and open to the public.

MAUREEN CAVANAUGH (Host): I'm Maureen Cavanaugh. You're listening to These Days on KPBS. The idea of using robots in warfare used to be science fiction. Now with the Predator drone, the PackBot in Iraq and Afghanistan, and a new generation of combat robots in development, the idea of robots either helping soldiers or taking the place of humans in war is becoming more of a reality every day. But when robots are put in charge of killing human beings, who is responsible when something goes wrong? What kind of morality will come into play when one side of a conflict is sacrificing people and the other merely damaging machines? As part of our monthly series on ethics and technology, we explore the increasingly pressing issue of using robot technology in warfare. How much are we already relying on robotics in our military engagements, and how much more are we likely to in the future? I'd like to welcome my guests. Barbara Fletcher is an ocean engineer and project manager for SPAWAR, the Space and Naval Warfare Systems Command. And, Barbara, welcome to These Days.

BARBARA FLETCHER (Ocean Engineer, Space and Naval Warfare Systems Command): Thank you.

CAVANAUGH: And John Sullins is professor of philosophy at Sonoma State University. He specializes in technology, artificial intelligence and robotics. John, welcome to These Days.

JOHN SULLINS (Professor of Philosophy, Sonoma State University): Good morning.

CAVANAUGH: And I'd like to invite our listeners to join the conversation. What do you think when you hear about unmanned Predator drones bombing in Afghanistan? How far do you think technology will take robots in warfare? Give us a call with your questions and your comments. The number is 1-888-895-5727, that's 1-888-895-KPBS. Well, Barbara, for those of us unfamiliar with the navy's different commands, tell us what SPAWAR is.

FLETCHER: Well, I'm from SPAWAR System Center Pacific, which is one of the preeminent navy laboratories for command control, communications, intelligence, surveillance and reconnaissance. And we do a wide variety of engineering efforts that overarch multiple platforms, so we not only work with ships, but also submarines, unmanned systems, space-based systems, hence the SPAWAR…

CAVANAUGH: Right.

FLETCHER: …portion, and how everything ties in so that the proper communication and control can occur over the battle space.

CAVANAUGH: And what does SPAWAR have to do with perhaps mechanized, automated robotic devices on the battlefield?

FLETCHER: Well, my particular area is underwater robotics and, in particular, applications to intelligence, surveillance and reconnaissance. And one of the things is, robotics provide an excellent tool for extending our reach, for being able to keep an eye out on things that – in areas where it's not feasible or safe to put human beings.

CAVANAUGH: That is really a – the lion's share of where robotics are being used in the military today. But you must keep up on everything that's in production. What is, for instance, on the ground in Afghanistan, in the air in Afghanistan? What are we using in terms of robotic technology?

FLETCHER: Well, I, number one, no, I can't say I keep up with everything that's going on because it is such an expanding field and there are people who spend their whole careers in any one of those areas you talk about. One of the things that I think that, as a technologist, I've got to say, is that right now the systems being used are generally controlled directly by the human being at the other end. Now the beauty of it from the operator's point of view is that they are out of harm's way and that – but they have direct control over what the robot is doing. They have the direct fire control.

CAVANAUGH: Right. And I want to move over to John because I want to ask you, John, the way that robots are being used today, being controlled basically at a distance, is that – would you characterize that as perhaps the first generation of robotic warfare?

SULLINS: Yes, currently the – we don't really know how to solve a lot of these problems in artificial intelligence, which would be required for a machine to operate completely autonomously. And I think for some time, at least the next generation or so, we're going to see these machines mostly controlled by humans. What we will see though is the growing capability of these machines to act semi-autonomously. So, currently, like a Predator drone flies itself to a mission and then some – humans get involved when things get a little more dangerous or are more active at the other end. But these things can pretty much get themselves around by themselves very easily. So I would call something like a Predator drone a semi-autonomous machine. What I'm worried about is when the necessities of the battlefield require semi-autonomous fire control because a battlefield is a quick and fast and dangerous place and if our machines – if our military commanders find that our machines are at a disadvantage having to wait for somebody to authorize a fire, I think the – It's just an engineering problem. That – You'll have to make them autonomous in their fire capabilities in order to succeed in the mission that they're designed to do.

CAVANAUGH: Is another – Barbara, is another term for the kinds of robotics that we have on the battlefield now telerobots? What are telerobots?

FLETCHER: Okay, telerobotics is exactly that. It's when somebody has direct telecommunications with the robot and can at least be in what we'd call a supervisory role, overseeing what's happening and that when an occasion comes up that human intervention is required, something that the programming is not yet capable of dealing with or that we deem necessary for human intervention, then they've got the teleoperation. They can say 'turn left' or 'drop the bomb' or 'turn around and come home.'
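
In code, the supervisory pattern Fletcher describes might look something like the minimal sketch below: the robot follows its own programming until a message arrives over the operator link. Every interface and command name here is hypothetical, invented only to illustrate the idea.

    import queue

    # Commands the human supervisor may issue over the telemetry link.
    # Weapons release appears only here: in teleoperation, fire control
    # comes from the operator, never from the robot's own programming.
    OPERATOR_COMMANDS = {"turn_left", "turn_right", "return_home", "fire"}

    def supervisory_loop(robot, operator_link):
        """Run programmed behavior, yielding whenever a human intervenes."""
        while not robot.mission_complete():
            try:
                command = operator_link.get_nowait()  # non-blocking check
            except queue.Empty:
                robot.step_programmed_behavior()  # no intervention: carry on
                continue
            if command in OPERATOR_COMMANDS:
                robot.execute(command)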

CAVANAUGH: And just for the sake of understanding, what's the difference between that and an autonomous robot?

FLETCHER: An autonomous robot is out there operating and does not have a direct link back to an operator. And that's the case with a lot of underwater systems that are used for science and even military surveillance things, is that because we don't have the communications links underwater that you do with air or ground, you – they have to have more smarts on their own. They have to be autonomous and able to deal with getting from point A to point B and doing certain tasks.

CAVANAUGH: What can they do on their own?

FLETCHER: Well, again, my area is underwater…

CAVANAUGH: Yeah, yeah.

FLETCHER: …robotics and the big thing right now is surveying. And so what these vehicles are commonly used for is they take – they have extensive sonar suites and mapping capabilities and they can go out and run a path and do very detailed surveys of the bottom. From the military point of view, that kind of technology has been applied to mine countermeasures where they can take the robot, program it, and send it out to search an area. They come back, they look at the data, and say, oh, there's a mine, there's a mine. There's something we aren't sure of…

CAVANAUGH: I see.

FLETCHER: …but we better be careful of.
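
The survey-and-review workflow Fletcher outlines reduces to two steps: programming a back-and-forth path over the search area, then classifying the sonar contacts after the vehicle returns. The toy sketch below illustrates it; the thresholds and field names are invented, not any real SPAWAR interface.

    def lawnmower_waypoints(x_min, x_max, y_min, y_max, lane_spacing):
        """Generate back-and-forth survey legs covering a rectangle."""
        waypoints, y, heading_east = [], y_min, True
        while y <= y_max:
            lane = [(x_min, y), (x_max, y)] if heading_east else [(x_max, y), (x_min, y)]
            waypoints.extend(lane)
            y += lane_spacing
            heading_east = not heading_east
        return waypoints

    def classify_contacts(sonar_contacts, mine_score=0.8, unsure_score=0.5):
        """Post-mission review: label each contact as mine, unknown, or clutter."""
        labels = {}
        for contact in sonar_contacts:
            score = contact["mine_likeness"]  # hypothetical classifier output
            if score >= mine_score:
                labels[contact["id"]] = "mine"
            elif score >= unsure_score:
                labels[contact["id"]] = "unknown - be careful"
            else:
                labels[contact["id"]] = "clutter"
        return labels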

CAVANAUGH: I'm speaking with Barbara Fletcher and John Sullins. We are talking about the idea of using robots in warfare, where we are with that technology right now, and where we're going. We're taking your calls at 1-888-895-5727. Let's take a call from Chuck in Carlsbad. And good morning, Chuck. Welcome to These Days.

CHUCK (Caller, Carlsbad): Thank you very much and thank – This is a very interesting topic to me because in the old days, I was involved in drone warfare and, however, the warfare was really simply taking pictures and identifying targets. And what we discovered was at the same time we had, in Vietnam, a device called a Light Observation Helicopter, which did the same thing, the only problem is about 50% of the pilots got killed and there was a horrible mortality rate. I'm in favor of robotic warfare. Right now, our enemies are using, I guess you'd have to call it, dumb robotic warfare when they plant mines to kill 16 people, 50 people, 100 people at all times, so we need to use what we can to save human life. Thank you.

CAVANAUGH: Well, thank you for your call, Chuck. I appreciate it. What is the present philosophy behind using robots in wars, John? What's the advantage? Did Chuck basically outline…

SULLINS: Well, I really liked what Chuck had to say there at the end. I think what we're seeing right now is a shift in warfare equivalent to what happened in the Renaissance with the advent of gunpowder. The wars will not be fought in the future the way that they're fought – they were fought in the past. And it's a long process. It started in Vietnam. It started in World War II with certain robotic weapons that were used there. We are now seeing, in Iraq and Afghanistan, the first stages of robot versus robot. We have these – What Chuck said was right. A roadside bomb is a kind of low tech robot and we're using high tech robots to try to discover and defuse these low tech robots. So we've al – we're already seeing the first robot war.

CAVANAUGH: Umm-hmm. Yes, go ahead, Barbara.

FLETCHER: And, if I may chip in, one of the things that inspired me a number of years ago, I'd say about 15 years ago, I was at a conference talking to a veteran of the first Gulf War and he was a navy SEAL and he talked about doing mine hunting by Braille. And, personally, I don't think that's something a diver ought to be doing, looking for mines by touch. That's something a robot should be doing, and I'm glad that, you know, the navy has put the money into the research, into pushing the technology so that, hopefully, the SEALs will be doing other useful things and not putting themselves in harm's way in that way.

CAVANAUGH: On the other side of the picture, though, there – A report in the Washington Post earlier this year by military writer John Pike described robots on the battlefield as stone cold killers. What will it mean to remove the human element in warfare, John?

SULLINS: Well, that's what I'm primarily interested in and what a lot of my writing concerns these days. I think that this is something that – that can't be done blindly. We have to really think hard about what we're doing. The unfortunate thing is that – is that wars are imminent affairs. They – Problems have to be solved fast and we've got ourselves into these worldwide conflicts which require quick and dirty thinking. What I'm worried about is that we're stepping too quickly into this world where we – we're going to have machines that make decisions: Shoot or don't shoot, kill or don't kill decisions. And I think that that's something that we may come to regret.

FLETCHER: Yeah, although one of the things that I think is interesting – I got into this business about 25 years ago. When I first started, I was told that full autonomy was 10 years away.

CAVANAUGH: Umm-hmm.

FLETCHER: Well, I was just at a conference for – the 30th anniversary of a conference on untethered vehicles and, yeah, we've come a long way in autonomy but nobody's even beginning to say that full autonomy is only 10 years away. So I think what I see is the fact that robots give us the extension, which has its own issues, but I still think the autonomy of the robots doing the shooting without the man in the loop is still a long ways off.

CAVANAUGH: Well, let's talk about some of the ethical issues that arise even today in the use of the present generation of robots that we have on the war – on the battlefield. And, John, I'm wondering, since you're studying this and thinking about this, what does the use of robots – Does it make warfare more ethical? Can an argument be made that what we're doing now makes warfare more just, in a sense?

SULLINS: Yeah, this argument has been made that what we are able to do is take the emotion out of warfare. Remember that quote you said, 'stone cold killers.' Maybe that's a good thing. Maybe, you know, a lot – I think that we just passed the anniversary of the My Lai Massacre and a lot of that was the psychology and the anger and the fear of the soldiers involved that led to that massacre. Robots don't worry about themselves. They don't have a right to self-preservation. They're not unlikely – A force that was teleoperated from a distance into a village, for instance, to search it out, the argument goes, would be less likely to create these sorts of horrible incidents. I'm not sure if that's the case but that is certainly the argument that's being made by the people who build these machines.

CAVANAUGH: And yet long-distance killing, in and of itself, is – removes one's sense of responsibility from what one does. You can even see that sometimes with people who are on bombing missions and so forth. So does that also make – does that also enter the – enter into the ethical picture as we look at the use of these devices?

SULLINS: Yeah, that's what I'm studying right now is this concept called telepistemology, that's kind of a technical term but epistemology is how do we know what we know. And telepistemology is how do we know what a machine is telling us is true. So if you look at what a, for instance, a Predator drone gives back to its pilot and its operators, it's a series of video images and readings on various dials and they have to take that and turn that into a picture of what the battlefield really is. And that's – it requires some mental gymnastics. It's not easy and it's not transparent. And what I think we need to really be conscious of is are we making sure that the information that's coming in to the operators includes enough information so that they can make ethical judgments. A person running from a roadside may be running because they set a bomb or they may just be running because they saw somebody set a bomb. How do you tell the difference? They're going to look the same on the video.

CAVANAUGH: We have to take a short break. When we return, we will continue our conversation on the idea of using robots in warfare, and I'll continue with my guests John Sullins and Barbara Fletcher. We're taking your calls at 1-888-895-5727. Stay with us. We'll be back in just a few minutes here on KPBS.

# # #

CAVANAUGH: Welcome back. I'm Maureen Cavanaugh. You're listening to These Days on KPBS. As part of our monthly series on ethics in technology, we're talking right now about the increasing use of robot technology in warfare. And my guests are Barbara Fletcher, ocean engineer and project manager for SPAWAR, that's the Space and Naval Warfare Systems Command. And John Sullins is professor of philosophy at Sonoma State University, specializing in technology, artificial intelligence and robotics. We're taking your calls at 1-888-895-5727. And let's take a call from Kevin in La Jolla. Good morning, Kevin, and welcome to These Days.

KEVIN (Caller, La Jolla): Oh, good morning. Thank you for this program and taking my call. I was watching C-SPAN over the weekend and saw a program involving several neocon analysts who were arguing what it would take to, quote, unquote, win in Afghanistan. And, of course, the advice was we need a lot more troops and a lot more money, kind of like what Johnson was told in '64. In any event, when questions eventually were put to the panel, people were asked, well, why is Afghanistan so important? I mean, why do we really stay there? And finally one gentleman by the name of Riddell, I believe, who's a fairly well known neocon who was involved in the Bush administration, just came out and said, well, we really need to have presence there because we need a base to physically launch drones and we can't do it, technologically, outside of the country itself. Now what I have heard is that these drones have collateral damage built right inside of them and they inevitably, if we take out some Taliban leader, we kill a lot of innocent people in the family and people at wedding celebrations and that kind of thing. Now I'm sure this man Riddell has probably never worn the uniform but certainly never seen killing. I was a Marine in Vietnam. It was a long time ago but I can tell you, I saw things I never ever want to see again. And there is a real deterrent on the field with battle-hardened men and now women, I guess, to refrain from pulling the trigger those several extra times and killing, you know, the woman and the child and so forth. The drone doesn't have that restraint.

CAVANAUGH: Kevin, I want…

KEVIN: I'd just like to ask about the ethical implications to that. This quest – What this man said chilled me right to my bone. He was very, very cavalier about it.

CAVANAUGH: Kevin, I really appreciate the call and the question and let's talk about that. Thank you so much. And I wonder, Barbara, is ethics something the military assesses when it considers implementing a robotic combat system?

FLETCHER: Oh, absolutely, and that goes at all levels. I mean, we're all human beings. We all do – have our job to do. And there are obviously variations amongst individuals but I think one of the key things is keeping an eye on the overall picture, being able to do the job effectively with a minimum of cost, and that's human cost as well as monetary cost.

CAVANAUGH: What about Kevin's point, John, that this man apparently said that they need to have a presence in Afghanistan in order to have a base for Predator drones. Does that make any sense to you?

SULLINS: I can imagine that being said, yes. And what – This is what I'm worried about, is that this new style of warfare is allowing us to think in these generational terms as far as warfare goes. As a philosopher, I study a theory called Just War Theory which, you know, war itself is probably not an ethical undertaking but if it can be ethical at all then it would have to be under this Just War rubric of thought. And one of the primary tenets of Just War Theory is that a war has to be fought quickly into a lasting peace. Your actions in a war are only just if they lead to an action – to a lasting, active peace. What I'm worried about is that these weapons allow us to be so distant from the conflict that we can propagate them for generations and generations with very little human cost on our end but extreme human wreckage on the other end.

CAVANAUGH: Right, what does that do to warfare? I mean, obviously there have been rich nations and poor nations trying to fight each other to the disadvantage of the poor nations for time immemorial. However, when one country is sacrificing human beings and the other country is destroying machines, that seems to up the ante in unfairness a little bit, would you agree, John?

SULLINS: Well, I don't know if we can talk about fairness in war. That – that's not what it's about. But we can talk about just behavior and proportionality. In Just War Theory, we can only propagate a war to the level that our opponents are forcing us to. And this technological advantage that we have will definitely change the rules of the game and I don't know if it's going to work the way we think it does because we will have – our enemies and even our friends are starting to become skeptical of this particular technology.

FLETCHER: I think…

CAVANAUGH: Yes, Barbara.

FLETCHER: I think John just raised a good point. Rules of the game, there are very formal rules of engagement. And the question a moment ago about is the military concerned, and absolutely, they have very formal rules of engagement. And from my point of view, I see some of this robotic warfare as actually helping sustain the rules of engagement in that you're not having people who are in fear for their lives, their squad around them, and having to do blanket shooting to get out of a place. When they're controlling a robot, they can go after the specific target, they can respond according to the rules of engagement without having the emotional reaction that they might if they were immersed in the situation.

CAVANAUGH: Let's take another phone call. Alan is calling from North Park. Good morning, Alan, and welcome to These Days.

ALAN (Caller, North Park): Good morning. I am a person who has, for a long time, believed that this Just War Theory is totally obsolete and that we have the wisdom to go ahead and work to find alternatives to war and warfare. And, unfortunately, the intelligence to create these robots has outpaced our wisdom as a society to work for and always, always find alternatives to violence.

CAVANAUGH: Alan, thank you. And that sounds like a question for a philosopher, John.

SULLINS: Yeah, I think I agree with Alan in that I – we need to work in that direction the hardest and that's – It seems like the robots might move our attention away from that because they allow us to continue with the behavior that is destructive and dangerous in a way that is less costly for us and so it may take away our incentives to look for the more challenging and the more difficult peaceful solutions.

FLETCHER: Well, I would agree with that to a certain extent and if the rest of the world felt that way, that would be a wonderful thing. Since many of the systems I work on are defensive systems that do surveillance and reconnaissance, things like inspecting ship hulls to make sure that nobody's planted mines on them, inspecting harbor areas to make sure nobody's come in, you know, I see the robotics as being a tool for our protection and helping to allow us to take the higher road, hopefully, along the way, that the technology itself is not necessarily making us more blasé about what – the impact of war.

CAVANAUGH: I want to talk about the idea that other nations will get on this bandwagon and start deploying their own robotic combat systems and so forth. And I'm wondering, if two countries are fighting each other with basically robot soldiers, okay, isn't it possible that the only people who will die in warfare are civilians? John?

SULLINS: Well, certainly it'll be the people that are around the battlefield that will be taking the brunt of the damage and you're right about that. One of the things that I think is another point we should bring up is that the people who control robots – Like a pilot at one point in our past used to have – tended to be an officer. Current drone pilots don't need that kind of training so they're being less and less trained so we're – And why I want to bring this up is that we're losing the concept, slowly but surely, of the professional soldier, the one who ostensibly would be trained in – at least in the rules of engagement, if not Just War Theory, and was a professional about what they did. And we're offloading much of that knowledge onto the machine itself. I hope we do a good job with that. I'm skeptical of whether we can.

CAVANAUGH: You talked a little bit about this new idea of shift soldiers. What does that mean, John?

SULLINS: Well, current – like, for instance, the drone pilots, most of them, operate out of an Air Force base in Las Vegas. And they can be fighting the war in Iraq or Afghanistan for a few hours and then go home and hang out at the pool and go see a movie or something and come back and fight the war again. And what we're already starting to see is that the mental gymnastics required by that are pretty intense. These are – Even just watching destruction on a video screen is a traumatic event for these people, and I'm not sure we're being smart about that or realistic in this thought that we can somehow make war a nine to five job, that it's not a horrible thing that needs to be avoided. Instead, it's a career.

CAVANAUGH: What about when robots – when there are kinks in the military's robotic systems and perhaps a robot goes bad and the system causes destruction? Whose responsibility is that, Barbara?

FLETCHER: Well, number one, largely when something goes wrong, what happens is things stop. That – and that's one of the areas where, you know, the navy laboratories and the scientists, the technologists, put forth a great deal of effort on command and control and failsafe operations. Yes, things go wrong, and typically what that means is you're down one robot. So that's – at this point makes for good science fiction stories…

CAVANAUGH: Uh-huh.

FLETCHER: …and movies. It has not really been a case in reality.

CAVANAUGH: But when we do take that extra step, when we get to an autonomous robotic system, you talked about the rules of engagement…

FLETCHER: Umm-hmm.

CAVANAUGH: …Barbara. How do we teach a robot the rules of engagement?

FLETCHER: Well, it would be programming. You – If x and y are met, you shoot, or you do this. But if x and y do not happen, then you shut down. And essentially that's the way any system works from your, you know, cell phone to your computer.
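
Taken literally, the conditional logic Fletcher sketches is very simple. The hedged example below shows it with invented condition names; the important property is that anything short of full authorization defaults to shutting down.

    def engagement_decision(target_confirmed, weapons_release_authorized):
        """Return the action the programmed rules of engagement allow."""
        if target_confirmed and weapons_release_authorized:
            return "engage"  # Fletcher's "if x and y are met"
        return "shut_down"   # fail safe: "if x and y do not happen"

    assert engagement_decision(True, True) == "engage"
    assert engagement_decision(True, False) == "shut_down"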

CAVANAUGH: And I'm wondering, John, when you look at how this is all coming about, how this is expanding, what are your major concerns?

SULLINS: Well, we've talked about a couple of them already but I'm concerned that warfare will lose its – our fear of warfare will be diminished somewhat in that we will feel that we can propagate these adventures with little or no cost and, therefore, be more cavalier about their use. That's one major concern. And the other is that we're taking these machines, which are wonderful and amazing things—I love robots and artificial intelligence—but we're applying them to the most ethically challenging human activity imaginable. I really wish that we had just spent more time with them just doing, you know, surveying and firefighting and more peaceful activities where we could've learned the ins and outs of how to construct these things before applying them to such a disastrous human undertaking.

CAVANAUGH: You sound as if you think that this technology is increasing exponentially. When do you think you might actually see an autonomous robot fight – conducting warfare?

SULLINS: Well, I'm going to – I don't – I think relatively soon. I'm a little more sanguine about it than Barbara because I believe that we've got – When these wars started, we had maybe less than 100 robotic weapons in the U.S. Army, now we have thousands. By 2010, it's a mandate to have a third of our aircraft be unmanned and by 2015 a third of our ground fighting vehicles unmanned. And so that kind of pressure, we're spending so much money on this. It's one of the greatest human undertakings right now, is what – the money that we're putting into unmanned fighting vehicles, so that's going to pay off. And I think we're going to have at least significantly semi-autonomous vehicles within ten years.

CAVANAUGH: I want to thank you both for speaking with us, and I want to let everyone know that the next Ethics Forum: "RoboWarfare. Is the World a Better Place When Robots Fight Our Wars for Us?" is taking place Wednesday, September 2nd, at 5:30 at the Reuben H. Fleet Science Center. The event is free. It's open to the public. If you want more information, you can go to KPBS.org/TheseDays. And if you would like to post a comment about what you've heard on this topic today, also you can go to KPBS.org/TheseDays. I want to thank my guests. Barbara Fletcher, thank you so much for being here.

FLETCHER: Thank you.

CAVANAUGH: And John Sullins, thanks so much.

SULLINS: It was a pleasure. Thank you.

CAVANAUGH: You've been listening to These Days on KPBS. Stay with us for hour two coming up in just a few minutes here on KPBS-FM.