You've found a serious security problem in your company's web application—one that puts your customers at risk of identity theft. Despite your protests, the problem is given no attention and persists for several weeks. Would posting an anonymous message to a public mailing list alerting your customers to the problem be an ethical thing to do? If your employer finds you out and fires you, was posting that message a principled act or a dastardly one?
Ethics is the branch of philosophy concerned with morality—good and evil, vice and virtue. How can we evaluate an act as being right or wrong? When faced with an ethical dilemma, how can we make the best choice? While people may proclaim their own system of morality to be the only correct one, all systems of ethics have deficiencies and criticisms. In the end, each of us is left to decide our code of ethics for ourselves.
Philosophers have grappled with this problem for millennia, and three main threads of thought have emerged. The first is teleological ethics, where "right" is defined as what leads to the best consequences. This encompasses theories such as utilitarianism, which holds that one must pursue "the greatest good for the greatest number of people." The second is deontological ethics, where "right" is defined by duties and rules, such as "It is wrong to lie." Here, we find the divinely ordained moral codes common to various religions, as well as the idea of the social contract—a set of rules by which people who unite into a society agree to abide. Finally, we have virtue ethics, which takes the question "Is this act right?" and turns it on its head. Instead, it asks, "What would a virtuous person do?" At first blush, this seems like a very circular definition of ethics, and it has been duly criticized as such. However, when faced with a moral dilemma, the answer to the question "What kind of person do I want to be?" can provide penetrating insight into the merits of one's choices.
While they employ different means of argument to get there, these three schools of ethical reasoning have a considerable amount of overlap in the acts and precepts they deem acceptable. Most notable amongst these is "The Golden Rule," frequently stated as "Do unto others as you would have them do unto you." This formulation often comes under fire as being too facile, and, indeed, if you look at it on a superficial level, you will find superficial problems. What if people like to be treated differently than you do? Could a thief not argue that since a judge wouldn't want to be sent to jail, the judge shouldn't send him to jail either? This thief, however, would probably prefer that anyone stealing from him be duly punished by the judge. Thus, a less pithy but more comprehensive way of expressing the Golden Rule might be "Treat people the way you would like to be treated if you were in their shoes."
What does this have to do with testing software? Nowadays, software has a profound impact on people's lives. Software's proper functioning—or lack thereof—dictates whether or not people get correct utility bills, a mortgage from their bank, or particular attention from law enforcement. In the case of medical or embedded software especially, software malfunctions may result in physical injury or death.
As software professionals, then, what are our responsibilities? Surely we have many of the same duties we would in any workplace. We have obligations to our peers and customers, such as dealing with them respectfully and with honesty. We have obligations to the company we work for, like performing our work diligently. However, because we possess specialized knowledge of and insight into software that the general public does not, I submit that we have a concrete ethical responsibility to society as a whole to use that knowledge in a way that protects the public good. As software is a machine of our creation, it is our duty to prevent it from visiting harm upon the populace. This duty to uphold the public good must supersede even our responsibility to our client or employer, lest those unconcerned with the ethics of an act be permitted to squelch objections from those who are.
Thus, as testers, we must not only do the best testing we can muster but also advocate for the users, trying to voice their interests where they cannot. We must critically analyze software systems and object to elements that are harmful to users, society, or public welfare—possibly enduring censure as a result. In addition, managers must cultivate trust amongst their charges and take their ethical concerns seriously, so that issues raised can be addressed before they start to snowball.
If we wish to be seen as a body of professionals, we need to adhere to professional ethics. Unfortunately, there currently is no widely agreed-on ethical code for software testers. This is a shame, since a common code helps in three ways:
- It provides guidance by outlining specific ethical standards and rules that the profession is expected to follow.
- It supports testers who raise ethical concerns by demonstrating that ethical commitments are not arbitrary whims of a single individual.
- It is an expression of the shared values of the testing community, and it carries a concomitant expectation that a tester who suffers retribution for making an ethical decision will enjoy the support of the community.
Reflect upon your personal ethics. What are and aren’t you comfortable doing? What standards would you like the people who test the software you use to uphold? While it can never make difficult decisions easy, knowing your own ethical code provides a measure of certitude should you ever be faced with a difficult ethical problem.
Analyzing a software project's ethical ramifications is as much a part of testing as analyzing a program for likely failures. As such, it behooves us to think about what a tester's code of ethics might look like and to discuss it with our peers. If we aspire to be professionals who can critique and evaluate software from a vast range of perspectives and who can make moral judgements that stand up to critique, then these are matters we must address.
User Comments
I think this "Ethics" dilemma is nothing but a lacuna in our moral strength, and it is not constrained to any Industry or Social community.
Whenever the courage of raising a voice against a institutionalized process is flagging in one's mind (a whole lot of reasons behind this), he presents this situation like a Dilemma and interesting part is that even he is unaware of the very fact.