How the UI of Military Drones Affects Empathy

How does the user interface of unmanned combat drones affect the emotional processing of the military personnel operating the aircraft?
UI evaluation
user behavior research
explorative

Abstract

The destruction caused by military actions may result not only in the loss of life, but also in the loss of places that hold historical and cultural value, such as artwork, homes, and ways of life. As “boots on the ground” are replaced with robots and unmanned aircraft for reasons of safety and cost[1], soldiers may begin to lose touch with what it is like to be on the front lines.

When embarking on this topic of research, I originally speculated that the answer would be “black or white”: that soldiers interacting with semi-autonomous weapons would show either decreased empathy or no change in their empathetic responses.

Upon investigation, I now recognize that the answer is not static. Designers and military personnel have a chance to consciously redirect the outcome of this technology as it pertains to soldier emotional response. As of today, without the proper training, work environment, and information about the attack, soldiers will resort to dissociation or emotional numbing techniques to suppress their natural human empathic reaction.

Users of machine learning technology on the battlefield are likely to become desensitized and less empathetic toward the people and cultures affected by the actions they take under the guidance of machine learning programs. I believe that we can change these circumstances to benefit the military, the individual soldier, and the unintended victims of these attacks.

Keywords: military, machine learning, semi-autonomous, empathy

Machine Learning and the Military

On April 26, 2017, Deputy Secretary of Defense Robert O. Work released a memorandum establishing the Algorithmic Warfare Cross-Functional Team (AWCFT), also known as Project Maven. For the United States to stay ahead of its adversaries, it is crucial that the military begin researching and implementing artificial intelligence (AI) and machine learning (ML) to develop weapons and communication systems. The primary application of the technology generated by Project Maven will be pairing image and video recognition with unmanned aircraft to collect and understand large amounts of data[2]. Soldiers, prone to fatigue, emotion, and confusion, are not equipped to quickly and efficiently process and interpret this data without the assistance of machine learning[3].
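To make the pairing of “image and video recognition” with unmanned aircraft more concrete, the sketch below shows, at a very high level, how a generic pre-trained object detector might be run over frames of aerial video to flag regions for an analyst to review. This is purely illustrative and assumes an off-the-shelf torchvision model; it does not describe Project Maven’s actual system, whose details are not public.

```python
# Illustrative only: a generic object detector applied to video frames,
# NOT a description of Project Maven's actual (non-public) system.
import torch
import torchvision
from torchvision.transforms.functional import to_tensor

# Load an off-the-shelf detector pre-trained on everyday COCO categories.
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

def flag_detections(frame_rgb, score_threshold=0.8):
    """Return detections the model is reasonably confident about.

    `frame_rgb` is a single video frame as an H x W x 3 uint8 array.
    In practice an analyst, not the model, decides what a detection means.
    """
    with torch.no_grad():
        output = model([to_tensor(frame_rgb)])[0]
    keep = output["scores"] >= score_threshold
    return output["boxes"][keep], output["labels"][keep], output["scores"][keep]
```

The point of the sketch is the division of labor the memorandum envisions: the model does the tedious frame-by-frame scanning of large volumes of footage, while interpretation remains with a human analyst.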

Some of the most powerful weapons systems currently in use already have “AI-like” elements. Predator drones, torpedoes, and cruise missiles are early examples of such technology. The development of smart bombs reduced the circular error probable from roughly 3,300 feet in World War II to just ten feet in the Gulf War, thus decreasing the unintentional loss of life. When troops entered Iraq in 2003, unmanned aircraft were an oddity. Six years later, over 5,300 unmanned aircraft were in use. That same year, the U.S. Air Force trained more soldiers to remotely control these drones than to pilot manned aircraft. Because of these advancements, some human rights advocates believe that only “smart bombs” should be used in war to prevent civilian casualties[4].
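To give a rough sense of scale for those accuracy figures, the small calculation below (a sketch, treating the circular error probable as the radius of the circle expected to contain half of a weapon’s impacts) shows how dramatically the area of likely impact shrinks between the two figures cited above.

```python
# A rough sense of scale for the accuracy figures cited above.
# CEP (circular error probable) is the radius of the circle expected
# to contain half of a weapon's impacts.
cep_early_ft = 3_300   # earlier bombing campaigns, per the paragraph above
cep_gulf_ft = 10       # precision-guided munitions in the Gulf War

area_ratio = (cep_early_ft / cep_gulf_ft) ** 2
print(f"Area of likely impact shrinks by a factor of about {area_ratio:,.0f}")
# -> roughly 108,900x smaller area of likely impact
```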

Fighting Environments

In a YouTube video titled “U.S. Army Soldiers Provide Mortar Support in Afghanistan,” posted in March 2012, viewers gain a sense of fighting on the front lines. The camera sits on a table in a dimly lit room. We see two men huddled around a monitor. A third man walks behind them holding a machine gun. We hear the popping of bullets. The view becomes shaky as our “cameraman” picks up the camera and prepares to exit the building alongside other soldiers. Someone yells, “Three, two, one, go!” and our cameraman is running. He sets the camera down again.

Within 29 seconds of the camera coming to rest, the team has fired three rounds from a large mortar. With each round, a soldier approaches holding a projectile with both hands, drops it down the barrel, and ducks; within a second, the projectile is firing where his hands once were. Over the next three and a half minutes, the soldiers communicate by yelling at one another over the noise of the battle and physically re-adjust the directions of their guns. When the noise of gunfire dies down, we hear the team panting and cursing as they reset their station. One man says, “They almost shot me in the arm… those [expletive] had us locked in, man. From right here, we couldn’t even get to our [missile rounds]. They kept digging off our [expletive].”[5]

In another video, titled “AC-130 Afghanistan Mission,” posted to YouTube in October 2011 and since viewed over 2.2 million times, we see a drone strike through the drone’s camera. On a black-and-white screen with a wide-set crosshair, we observe what the video describes as “terrorist training grounds.” The camera uses a thermal imaging system; humans appear bright white against the dark vegetation and buildings. Four men communicate over a radio system about the buildings and make a conscious effort to confirm which of the two buildings is a mosque. One man says, “Do not engage the mosque.” Another responds, “The square building is the mosque?” and a third quickly clarifies, “The rectangle! The rectangle is the mosque!” Four minutes into the video, one voice confirms that the drone can strike. For five minutes, they follow individual targets as they run away from the blasts. The drone does not have a microphone, so we cannot hear the blasts. Over the radio we hear, “You got the other guy. That was within two feet of him” and “That one’s still crawling” when referring to their targets.[6]

Emotional Impact

For this paper, empathy is defined as “the ability to experience and understand what others feel.” Research has found that people place “high emotional value on objects” such as robots that have backstories[7]. Since such connections can be formed with non-living objects like robots, I initially believed that a soldier experiencing a traditional sense of fighting would feel more empathy toward the people and cultures affected. A soldier on the front line has the chance to create a backstory. They live for months at a time in barracks built in the desert. They have seen the homes, markets, and families of the people in the nearby cities. They experience the force of the projectile leaving their gun, hear the impact on the ground, and feel the return fire. Through this experience, they begin to value human life more than ever. They are always in immediate danger. The U.S. soldier is fighting for their life and what they believe in. Many of these soldiers begin to acknowledge that the enemy is also fighting for their version of those same things.

In November 2015, four former U.S. Air Force drone pilots spoke out about their experiences operating semi-autonomous weapons. One of these men, Michael Haas, flew drones over Afghanistan for six years. He said his office, outside Las Vegas, Nevada, was often filled with “colorful language” to describe targets during the strikes. He and his peers would use phrases like “cutting the grass before it grows out of control” and “pulling the weeds before they overrun the lawn” when referring to the killing of targets. If they happened upon children, they would call them “fun-sized terrorists.” While training new soldiers, Haas failed a student who wanted to use a drone to attack a group of Afghan civilians he claimed “looked suspicious” but could offer no evidence as to why. For that failing grade, Haas was reprimanded by his superiors and told the Air Force “needed bodies” to operate the unmanned aircraft.[8]

Haas is quoted as saying, “Ever step on ants and never give it another thought? That’s what you are made to think of the targets – as just black blobs on a screen. You start to do these psychological gymnastics to make it easier to do what you have to do – they deserved it, they chose their side. You had to kill part of your conscience to keep doing your job every day – and ignore those voices telling you this wasn’t right.”[9]

Another former airman, Brandon Bryant, shared how the situations he was involved in included considerable uncertainty about whether a target was a genuine threat. The U.S. government does not publicly release drone strike data, including how many civilians died in an attack. However, the website “Out of Sight, Out of Mind: A visualization of drone strikes in Pakistan since 2004” uses data collected by the Bureau of Investigative Journalism (whose reporters conduct primary research at the scenes) to share this information.[10] By their numbers, an estimated 724 children and civilians were killed in strikes. Another 2,565 casualties were categorized as “other,” meaning their status as insurgent or civilian could not be confirmed or clearly defined. Only 52 of the 3,341 deaths were high-profile targets.[11]
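Taking the visualization’s three categories at face value, a quick check (a sketch, assuming those three categories together make up the 3,341 reported deaths) shows how small a share of the total the named, high-profile targets represent.

```python
# Figures as reported by "Out of Sight, Out of Mind" (Pakistan, 2004 onward).
deaths = {
    "children and civilians": 724,
    "other (unconfirmed status)": 2_565,
    "high-profile targets": 52,
}

total = sum(deaths.values())  # 3,341 reported deaths
for category, count in deaths.items():
    print(f"{category}: {count} ({count / total:.1%})")
# high-profile targets come out to roughly 1.6% of all reported deaths
```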

This uncertainty contributes to the mental stress drone operators feel on the job. At the 480th Intelligence, Surveillance, and Reconnaissance Wing at Langley Air Force Base in Hampton, VA, drone operators and their commanders are taking a proactive approach to acknowledging the effect drone strikes have on the operators. One soldier, named “Alicia” for the article, mistook a group of women and children for insurgents. Because of the poor camera quality, she came within moments of ending their lives. After this near-tragic incident, Alicia requested and was granted a special security clearance to talk with Air Force psychologists and analysts about the drone attacks in which she was involved. Alicia stated, “We were striking a lot at that time, and for me, it felt like I wasn’t getting enough of the story behind the strikes… I felt like I didn’t know enough.” Alicia’s commander, Col. Jason Brown, states that his soldiers find mass graves and witness executions and torture. A study found that one in five drone operators had remotely witnessed a rape while on the job; their job duties do not allow them to look away.[12]

Haas’s description of doing “psychological gymnastics,” Alicia’s need to become more informed, and Col. Brown’s admission that drone pilots cannot look away may be creating an environment in which a soldier is more likely to develop the dissociative subtype of PTSD. The National Center for PTSD describes dissociation as an individual’s attempt to defend against a traumatic experience. Dissociative PTSD can occur when “confrontation with overwhelming experience from which actual escape is not possible, such as childhood abuse, torture, [and] war trauma challenges the individual to find an escape from the external environment as well as their internal distress.”[13]

The Role of the Designer Moving Forward

When reviewing the work environments of Haas in Nevada and Alicia in Virginia, three main factors emerged that I believe contribute to the emotional response of semi-autonomous drone operators: operator training, work environment, and access to information about the attack. Simply changing training strategies to acknowledge the humanity of targets, establishing a work environment where children are not called “fun-sized terrorists,” and increasing knowledge about the attack does not account for the complexities of working within the military system. Nor does it lessen the unintentional loss of life, a burden many of these soldiers carry.

The military does not want a more empathetic soldier. An empathetic soldier will hesitate to pull the trigger and may disobey orders. An empathetic soldier will cost their employer more time and resources as they attend therapy to address guilt and grief.

Designers walk the line between solving for function and solving for user interest. They have the opportunity to shape the implementation, the user interface design, and the training systems that introduce this tool to soldiers, so that soldiers understand the purpose and outcomes of their actions.

Moving forward, the designers of these weapons must implement a user interface and observational system that does not necessarily increase (or decrease) empathy but uses machine learning to make the identification and destruction of targets more accurate. This increased accuracy will lessen uncertainty so that soldiers do not have to resort to emotional numbing to do their jobs.