Ethical Software Engineering

18 Apr 2019



    Computers are binary: one and zero, on and off. Syntax is either right or it is wrong. As programmers, we get used to this clear-cut distinction, and for many of us, this level of clarity helped draw us to the field in the first place. After all, we like to solve problems. We like to know when we are correct, and the computer will always tell us, one way or another. However, while writing code may be done in a black-and-white world, the programs we create exist in a complicated reality dominated by the gray area that lies between. Once we release software into the world, we have no control over how people will use it. So, what are our responsibilities as creators of technology, and to whom are we responsible? A software engineer should answer these ethical questions before writing a single line of a new program.

    So, what are our responsibilities as creators of technology? It is not reasonable to expect a programmer to predict and account for every possible use of a program; in fact, that would be an impossible task. Yet I believe that one should create software with these uses in mind. I would go as far as to say that it is just as important to consider the negative consequences of a new program as it is to focus on the benefits. Software has permeated the deepest cracks of our world, and for this reason, we must consider everyone. By that I do not mean everyone who might use our program, but everyone in the world. While this concept may sound ridiculous in scale, I stand by it. A software developer needs to at least consider what might happen. If I have learned anything about this world throughout my life, it is that if something has the potential to cause harm, someone will take advantage of it. So we must ask ourselves: is the good worth the risk of the bad? Do those who will benefit outweigh those who might be harmed? These questions have no simple answers, and the complex answers will be different for every programmer and every program. To explore the topic of ethics further, it is worth looking at a real-life example. In 2018, the New York Times published an article called "Delay, Deny and Deflect: How Facebook's Leaders Fought Through Crisis," which delves into a contemporary example of the complexity of software ethics.

    First, I believe it is necessary to note that not every program is Facebook, one of the most influential pieces of software in our connected world. Yet it is precisely because of this central importance in so many of our lives that Facebook makes a perfect example for contemplating ethics. For a little background: Facebook was a central launching point for the Russian intelligence operation that famously meddled in the American presidential election of 2016. The company clearly did not intend to be used as a platform for an attack on a democracy, and yet its software was used in exactly this manner. When evidence began to emerge of the extent to which the platform had been exploited, Facebook's executives had to decide how to react. Instead of openly accepting the reality that their company was being manipulated by an intelligence agency hostile to their home nation and working to resolve the issue, those in power at Facebook chose to deflect and deny. They asked the question, "To whom are we responsible?" and answered, "Shareholders." Facing criticism, they began a public relations campaign designed to spread the blame to others. They sought to implicate their economic rivals Google, Twitter, and Apple, and even went so far as to suggest that people were attacking their company with an anti-Semitic motive.

    The ethical response to this situation would have been to examine the facts, share them with the American people, and take strides to prevent a repeat of the problem, thus minimizing the harm. Facebook was built on a foundation of user content, and the company has been highly successful by letting people create nearly anything they want. What would people think if Facebook began censoring users' content? I believe the executives felt threatened by the bad publicity the Russian operation could bring and focused first on protecting their shareholders. Yet this policy of denial only went so far as the evidence continued to mount. Faced with the inevitability of a losing battle, Facebook's executives began to change their tune, accepting that their company was involved and looking for avenues to do good, such as preventing sex trafficking. While I believe these avenues were, on the whole, beneficial to the world, I have no doubt that they were pursued in order to protect the Facebook brand. This was particularly apparent after it was leaked that the company had allowed the personal information of millions of users to fall into the hands of Cambridge Analytica, a political research firm that had no rights to the data, which leads me to the final point I would like to make.

    Data is everything. I get that. I understand that personal data is what drives Fortune 500 tech companies like Facebook, Google, and Twitter, to name a few. They make money by selling access to users, and this raises the question: what responsibility do these companies have to those users? My answer: the same responsibility that all software engineers have to their users. They need to make their policies on how personal data will be used perfectly clear, but more importantly, they need to give users ultimate control over that data. To me, this failure to let users maintain control over their own data is Facebook's ultimate ethical shortcoming, and it is in fact the reason I no longer use the service in any capacity beyond checking Messenger every few days.

    Ultimately, I have a great deal of respect for Mark Zuckerberg and Sheryl Sandberg as businesspeople, but I do not envy their ethical choices. They cannot prevent Facebook from being used to spread misinformation, but they could do more to limit it. They cannot make nearly as much money without exploiting personal information, but they could do more to allow people to opt out and protect themselves from overreach. Facebook is an example of a software company that failed to act ethically, and one that every aspiring software engineer should learn from.