Ethical Engineering Means Choosing Your Work

I once heard a shocking comment from a colleague at a previous job in the aerospace industry. My colleague told me, “Joe, I think you’re a lot like me – you don’t really care about what your work is for, so long as it involves solving challenging problems.” I walked out of that conversation with a notion solidifying in my mind: if that company had that impression of me, I had to get out.

When engineering students, or young professionals, think about engineering ethics, they usually deal with topics narrowly pertaining to their problem-solving work: for example, making sure that they report correct results, that they do not misrepresent their data, or that they raise issues they find to their management. Human safety is often the main focus, especially preventing injury or loss of life due to improper operation of the engineered system. A classic example is the o-rings on the Space Shuttle Challenger’s solid rocket boosters: an engineer had an ethical obligation to raise his concerns to try to save the crew, and the engineering management suffered an ethical failure in refusing him.

But what about the ethics involved in the proper operation of a system? There is an aspect of engineering ethics that rarely gets attention in engineering instruction: an engineer’s ethical responsibilities in choosing which projects and programs to work on. A wonderful essay by Darshan Karwat on this subject appeared recently on the Union of Concerned Scientists’ blog. As an engineer in aerospace – a canonical “dual-use” discipline, meaning it has both civilian and military applications – I offer my own opinion here.

The foundational hero of American rocket science is Wernher von Braun. The story goes something like this: in Germany, von Braun determined that he wanted to send rockets, and people, to other planets. He found any way he could to pursue his passion – even working with Germany’s Nazi regime. But, as Allied forces pushed into Germany and the Nazi military crumbled, von Braun quickly defected to the American soldiers. In the United States, he became our premier rocket designer. He brought Americans into space. He built the mighty Saturn V. He landed astronauts on the Moon.

Von Braun had a terrible dark side, though. He wasn’t forced to help the Nazis – he was eager, accepting an officer’s commission in the Nazi SS. The V-2 rockets he developed, in full knowledge that he was relying on horrific slave labor, were intended to kill civilians in England. (Perhaps apocryphally, he supposedly commented that the V-2 was perfect except that it landed on the wrong planet.) He switched allegiances at the drop of a hat, looking for whichever masters would support his work. And, in America, the first fruits of his labor were not Moon rockets for NASA, but nuclear-armed ballistic missiles for the Army – the most reprehensible weapons humankind has yet produced.

On balance, von Braun was evil.

The trap for engineers today is in thinking like my former colleague – that engineers are just solving technically challenging problems, and it’s up to somebody else to decide how to apply the technologies we invent. You may also be familiar with this variation of the argument: technology is neither good nor evil; it’s how we use it. Both statements punt decision-making responsibility to somebody else, ostensibly national leadership. Yet, with that leadership mired in bureaucracy, groupthink, corporate lobbying, and an ever-shrinking attention span, the best-equipped experts in technologies and their potential applications are not the people at the top, but the front-line engineers.

I would argue that we were never really off the hook in the first place, but we are coming to the end of an era in which it was acceptable for us not to ask questions about the ethical ramifications of our work. In the past few years, virologists have grappled with whether it is ethically okay to make an infectious disease more deadly, in the interest of understanding how that deadliness arises. In 2018, the computer engineers who brought us Facebook and Twitter are grappling with the ethical consequences of the fact that their algorithms polarized the United States, fanned the flames of far-right extremism, and opened the US up to foreign propaganda. It is long past time for aerospace engineers – practitioners of a discipline whose technologies could bring us globally devastating war – to confront the ethical albatrosses of our field. I contend that front-line aerospace engineers should refuse to work on programs involving such things as nuclear weapons, hypersonic weapons, antisatellite weapons, and ballistic missile defenses, on the grounds that all of these military systems make our troops, our population, our nation, and our world less safe.

It falls to us engineers to consider the ethical implications of our work, and this includes asking a lot of hard, introspective questions – questions that may seem more like political philosophy than engineering. My inclusion of ballistic missile defenses in a list of suggested proscribed programs may seem confusing; however, it makes for a terrific example of how we must consider our work through a much wider lens than the “interesting technical problem.” There are three key reasons why working on missile defenses is ethically dubious to me.

First, there is a very common misconception about ballistic missile defense, promulgated these days by news agencies reporting on North Korean nuclear missiles: that missile defenses exist to protect the American homeland and civilian population. In truth, the population is too spread out to defend – and the purpose of missile defense is to protect military assets deployed to contentious areas, such as an aircraft carrier in the South China Sea or a Middle East forward operating base within striking distance of Iran. So, missile defense isn’t really a purely defensive system, but a component of an offensive system. (This is why the Pentagon, an agency not very good at self-reflection or putting itself in others’ shoes, gets wound up about the threat of other countries deploying missile defenses.) Since these defenses only appear as part of offensive systems, they can only be as ethical as deploying the offensive system in the first place.

Second, missile defense gives its protected soldiers and sailors a sense of security. This might seem like a good thing – however, remember that these troops are deployed to contentious places. A sense of security can make them more likely to place themselves in harm’s way, and then get hurt. Worse, it could make them more likely to provoke their adversaries, thinking themselves invulnerable to attack!

That leads me to my third point, one that ties into more traditional interpretations of engineering ethics: the sense of security is false. In carefully controlled test conditions, missile defenses have a modest success rate. Operationally, however, American missile defense simply does not work. According to the linked article, military officials may even define success criteria that skew towards a high number of successful interceptions, by allowing an interception to be classed as a “success” even if the incoming warhead actually hits its target and detonates. What’s more, there is no reason to expect that this poor track record can be improved: the most intractable sensing and control challenges are on the defensive side; it will always be easier and cheaper to foil missile defenses than to develop an effective defense. Only the ingrained thinking of our government keeps missile defense development continuously funded.

Considering these three points, while ballistic missile defense pays engineers’ mortgages, I do not understand how a self-respecting engineer can be ethically engaged in this work.

It’s difficult to engage in the philosophical discussions needed to make these determinations – especially when such a large portion of the funding for aerospace engineering work comes from the US military. But, as Karwat’s essay for the Union of Concerned Scientists puts it,

In today’s political climate, engineers cannot remain passive and allow legislators and politicians to decide what the “public good” is. All members of a community must be engaged and responsible in deciding what the public good is and how to create it—and that goes especially for engineers and the companies they work for, because they can have a disproportionate and lasting impact on a community.

Several engineering societies’ codes of ethics speak to this point, directly or indirectly, as the essay mentions. The American Institute of Aeronautics and Astronautics, a professional society of aerospace engineers, also has one, and the top item on its list is:

  1. Hold paramount the safety, health, and welfare of the public in the performance of their duties.
    1. Recognize that the lives, safety, health and welfare of the public are dependent upon professional judgments, decisions and practices.
    2. Seek opportunities to be of service in professional and civic affairs and work for the advancement of safety, health, and well-being of our communities.
    3. Report suspected violations of this element of the code to the proper authority and cooperate in furnishing further information and assistance as required.

Deciding not to think about public safety, public welfare, and the well-being of our communities when staffing a project is an unethical luxury that engineers have indulged in for far too long. When somebody presents us with an interesting technical challenge, we must first ask ourselves: if this technology gets produced and deployed, does that enhance the public good? Does it make my community safer? Does it increase public well-being? Choosing not to ask these questions is an abdication of our ethical responsibilities as engineers. And if we are skeptical of a project, providing a positive ethical justification is the responsibility of those who conceived and funded it.

We cannot think of ourselves purely as a local community. The world is full of global challenges that impact us in our day-to-day lives – everything from the complex effects of economic globalization, to refugees streaming out of Syria, to retweets shaping international opinion, to global warming causing resource shortages and disaster damages. We must think about how the fruits of our labor affect all of these challenges and more. We must be proud to engage in debate about our work, and we must not hesitate to refuse work that contravenes our principles.

Especially when our national leadership gives up its moral and ethical authority, we must all take up this responsibility.
