Here's one way to think about the ethics of software: in terms of multipliers. Think back to the last major email virus, or to when the movie "The Two Towers" was released. No doubt you heard or read a story about how much lost productivity this bane would cause. There is always some analyst willing to publish an outrageous estimate of the damage these intrusions do to our working lives. I remember hearing about the millions of dollars supposedly lost to the economy when Star Wars Episode I was released.

(By the way, I have to take a minute to disassemble this kind of analysis. Stick with me; this won't take long.

If you take 1.5 seconds to delete the virus email, it costs nothing. The impact on your day is immeasurable; it won't even dent your productivity. You will probably spend more time than that discussing sports scores, going to the bathroom, or chatting with a client, to say nothing of the hundreds of other things human beings do during a day. It's lost in the noise. Nevertheless, some analyst who likes big numbers will take that 1.5 seconds, multiply it by millions of other users, then multiply the total by the "national average salary" or some such figure.

So, even though it takes you longer to blow your nose than to delete the virus email, somehow it still ends up "costing the economy" five million dollars in "lost productivity". The underlying assumptions are so flawed that the result cannot be taken seriously. Nevertheless, this kind of analysis gets dragged out every time there's a news story--or better yet, a trial--about an email worm.
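Just to make the sleight of hand concrete, here is the entire "methodology" in a few lines of Python. Every input below is invented; the method, not the numbers, is the point.

```python
# The analyst's "methodology", with every input invented for illustration.
seconds_per_user = 1.5        # time for one person to delete the virus email
users = 100_000_000           # "millions of other users"
hourly_wage = 120.0           # a generous "national average salary", per hour

# Aggregate trivial per-person time into an impressive-sounding total.
hours_lost = seconds_per_user * users / 3600   # 41,667 hours nobody will miss
damage = hours_lost * hourly_wage
print(f"'Lost productivity': ${damage:,.0f}")  # -> 'Lost productivity': $5,000,000
```

Change any input by a factor of ten and the headline changes by a factor of ten, which tells you exactly how much the result means.)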

The real moral of this story isn't about innumeracy in the press, or spotlight seekers exploiting said innumeracy. It's about multipliers, and the very real effect they can have.

Suppose you have a decision to make about a particular feature. You can do it the easy way in about a day, or the hard way in about a week. (The numbers are hypothetical.) Which way should you do it? Suppose further that the easy way makes four new fields required, whereas the hard way makes the program smart enough to handle incomplete data. Now which way should you do it?

Required fields seem innocuous, but they are always an imposition on the user: they force users to gather more information before they can even start their work. That often means users keep their data on Post-It notes until they are ready to enter it, resulting in lost data, delays, and general frustration.
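To make the tradeoff concrete, here is a minimal sketch of the "hard way" in Python. The WorkOrder schema and its field names are hypothetical; the point is that the record tolerates gaps and the program reports them, instead of refusing to let the user start.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class WorkOrder:
    # The one thing the user must know before starting (hypothetical schema).
    description: str
    # The "easy way" would make all four of these required up front.
    # Allowing None lets users start with what they have and finish later.
    customer_id: Optional[str] = None
    cost_center: Optional[str] = None
    due_date: Optional[str] = None
    approver: Optional[str] = None

def missing_fields(order: WorkOrder) -> list[str]:
    """List what is still unknown, instead of refusing to save the record."""
    deferrable = ("customer_id", "cost_center", "due_date", "approver")
    return [name for name in deferrable if getattr(order, name) is None]

# The user can capture the work immediately...
order = WorkOrder(description="Replace pump on line 3")
# ...and the program nags gently instead of blocking.
print(missing_fields(order))
# -> ['customer_id', 'cost_center', 'due_date', 'approver']
```

The extra week of development buys every user the right to start with what they have, and that is where the multiplier works in your favor.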

Let's consider an analogy. Suppose I'm putting up a sign on my building. Is it OK to mount the sign six feet up on the wall, so that pedestrians have to duck or go around it? It's much easier for me to hang the sign if I don't have to set up a ladder and scaffold, and it's only a minor annoyance to the pedestrians. It's not as if it blocks the sidewalk; all they have to do is duck. So I save an hour installing the sign, at the expense of two seconds from every pedestrian passing my store. At two seconds apiece, it takes just 1,800 passersby to cancel out the hour (3,600 seconds) I saved, and every pedestrian after that puts the sign further into the red. Over the long run, those two-second diversions add up to many, many times more than the hour I saved.

It's not ethical to worsen the lives of others, even a little, just to make things easier for yourself. Successful software is measured in millions of users. Every requirements decision you make is an imposition of your will on those users' lives, even if it is a tiny one. Always be mindful of the impact your decisions--even small ones--have on those people. You should be willing to bear large burdens to ease theirs, even if your impact on any given individual is minuscule.