Can developer productivity be measured?

Defining and measuring dev productivity is something of a great white whale in the software industry. It’s the basis of enormous investment, the value proposition of numerous startups, and one of the most difficult parts of an engineering manager or CTO’s job description. It’s also a source of anxiety for developers at all experience levels: how do you know if you’re doing enough, both on and off the clock? When everything you do is intangible, how should you measure it? Can it be measured at all? In this article I’ll discuss the biggest pitfalls of productivity measurement and a few ways to do it well.

In software development, as in any other field, many people think of productivity in terms of inputs and outputs. A full-time developer works 40 hours per week for an average salary of $107,510 per year in the United States. Hours and salary are visible, easily quantifiable inputs. The developer then produces software features, documentation, deployments, and/or bug fixes on a recurring basis. These are outputs. If developers are as simple as the software we imagine they are writing, then increasing their productivity should be as simple as asking them to work more hours or paying them higher salaries. Of course, this is a fairy tale. Neither developers nor software work like that.

The problems of input measurement

“Hours worked” is one of several false metrics used as a proxy for job performance. I mention it first because it’s an oft-unexamined default, a path of least resistance. Unless a company intentionally guards against it, the workplace will sooner or later deteriorate into an hours-only environment. Outside of a pandemic where remote work is the norm, the symptoms of an hours-only environment are easy to recognize. Working hours are seen as non-negotiable, and being present at the office is seen as proof that someone is working. Anyone who tries to leave the office a couple of hours early is met with hostility (sometimes as muted as a few raised eyebrows, sometimes more brazen). Anyone who works late into the evening or comes in on the weekend is seen as a high performer. The incentives of this “last to leave the gym” culture are unfortunate: developers are pushed to spend more and more of their lives at work, left without any other way to demonstrate their value, and lulled into paying only secondary attention to their work output. As time goes on, the workplace becomes more and more a place where everyone is working but nothing is getting done.

The problems don’t end there. It’s tempting to assume that all work is “positive work”: that is, that all work represents progress toward a goal. That assumption is false. Developers who have worked while exhausted, distracted, or sick tend to be familiar with the concept of “negative work”: work so poorly done that it must be undone or compensated for later, thus increasing rather than decreasing the amount of work remaining. Software development is complex, abstract, attention-intensive work, and therefore hypersensitive to a developer’s mental state. That is, there are hidden inputs at play: anxiety, depression, burnout, toxicity at work, grief, microaggressions, and a hundred other things that can reduce or invert individual productivity on any given day. If company culture demands long hours week after week, or even just eight-hour days with no flexibility or vacation time, developers will inevitably spend time doing negative work: they will literally accomplish less by staying late than they would have if they had gone home earlier. And due to fatigue, they’ll accomplish less the next day too.

On the other hand, an hours-only environment is not the worst-case scenario. It has a veneer of fairness about it: if two developers are working the same number of hours, there is one clear dimension on which they are equals. Neither of them appears to be slacking off, and neither appears to be doing more than their fair share. If they produce less than expected, well, at least they put in their time. And the “hours worked” metric doesn’t explicitly incentivize bad code the way some metrics do. So while it’s a poor metric, and even works against productivity in many situations, there are much worse metrics we should discuss.

Consider the other obvious input to software development: money. I have jokingly suggested to my manager once or twice that productivity should be measured by salary, and that if my salary were doubled I would produce code at the level of a world-class software architect. Of course, you know intuitively that this is ridiculous. Paying someone more money doesn’t immediately make them more productive (although, indirectly and on a limited scale, it may). Yet, in my mind, money and hours belong to the same category: not just inputs, but auxiliary ones, only tenuously driving productivity. One is given by the employer, the other by the employee, but this exchange is incidental to the creation of useful software.