The insidious cost of incrementalism — Part 1

March 8th, 2013 | Plexxi | 4 Comments

The networking industry at large is mired in a culture of incrementalism. Put simply, incrementalism is continued growth through simple addition. You start with an existing product or solution, you identify some problem or deficiency, and you add to the original. Over time, more and more problems are solved, and products become more elaborate and complex. Ultimately, the products end up representing the sum total of all the capabilities ever created.

This mode of operation can be quite powerful. It means transitions and migrations are familiar and easier to manage. Deployed solutions get steadily better over time as new capabilities are introduced (typically through software). Existing integrations with surrounding tools and processes carry over, since new products have all the same interfaces, plus a few more.

But there are costs of incrementalism, and because they are hidden, the costs are actually quite insidious.

Over the years, I have held a number of positions in training, operations, product management, strategy, and transformational change management. My roles kept me particularly close to development, so I have seen from the inside how incrementalism can affect an organization.

It starts with feature development. A new product comes out. It has new customers who want their new features. Teams rally around the requests, ultimately delivering all kinds of new capabilities. The engagements between product management and engineering grow increasingly transactional – each new request brings new revenue. The business is growing, life is fantastic.

Over time, the product grows. Foundational elements of the architecture are used in new and different ways, some of which were never planned. New architecture is layered on top of old architecture, new interfaces intermingle with old interfaces, all in a rush to hit the next release and win the next bit of revenue. Over time, the layers of code add entropy to the system, and this leads to a slow deterioration of performance that eventually results in software that is either faulty or unusable. This phenomenon is common enough that it actually has a name (and Wikipedia entry): software rot.

The challenge is that incrementalism is great at adding new things, but it rarely affords us the opportunity to go back and re-architect major facets of the solution. And incrementalism almost never encourages us to look at the collection of stuff we have accumulated and retire the things that are no longer necessary or desired. Look at any major product – how often are features deprecated as an act of proper code hygiene? Let’s be honest – this never happens, because there is never a customer sitting there promising to pay money if you would just remove a feature they don’t use.

And so our products grow. And grow. And grow. They start to resemble Franken-products, a bunch of sometimes unrelated parts cobbled together in a single form factor.

On the vendor side, this can have a profound impact on product development. Let’s ignore the obvious things like increasing complexity and longer development cycles. How does a vendor validate that the features all work? How do you make sure they all work together? You start by working through complex test matrices (feature A works with feature B and feature C). But the combinatorial effect is too huge to stick to that plan. So you start doing unidimensional development and test (just make sure feature D works, forget about testing with A, B, C). Customers report some bugs, you fix them. They find more. You fix them. They find more and complain. You augment your test environment to test their specific configuration. And you call this scenario or customer use case testing.
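To put rough numbers on why the full test matrix collapses so quickly, here is a back-of-the-envelope sketch in Python. The feature count (50) is purely hypothetical, and treating each feature as a simple on/off toggle actually understates the real interaction space:

    from math import comb

    # Hypothetical illustration: assume the product exposes 50 features,
    # each of which is simply on or off in a given deployment.
    NUM_FEATURES = 50

    # Validating every combination of enabled features means 2^50 distinct
    # configurations -- far beyond what any test organization can run.
    full_matrix = 2 ** NUM_FEATURES

    # Even scaling back to "does feature A work with feature B?" for every
    # pair is C(50, 2) interaction tests, before multiplying by platforms,
    # software versions, and scale points.
    pairwise = comb(NUM_FEATURES, 2)

    print(f"Full feature matrix:  {full_matrix:,} configurations")
    print(f"Pairwise checks only: {pairwise:,} tests")

Even at that modest feature count, the exhaustive matrix runs to roughly a quadrillion configurations while the pairwise matrix is only 1,225 tests, which is why teams retreat first to pairwise checks and then to single-feature testing.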

But what about the next customer whose deployment is similar but slightly different? So you group use cases that look similar, and you call these reference use cases or solutions. You test the heck out of it, but you know that even small deviations from the test cases can result in huge swings in testing efficacy. The tests are too brittle to be reused. So you add more test engineers and more test beds. Capacity scales linearly with resources, so as you grow the business, you add more and more.

Eventually your spend outpaces your sales. Wall Street punishes your stock, so your CFO tells the company that everyone is going to be cost conscious. Hiring slows down. You search for efficiencies in the processes and tools. One day, amidst the quality issues, someone stands up and points out that you cannot test quality into a product. Testing finds issues but doesn’t prevent them from being added in the first place.

So you spend more money. Hire some consultants to analyze your development environment. Roll out a new tool. Implement a new process. Train your new engineers. But these fixes take quarters and years to show up. Customers need bugs fixed now. So you increase the percentage of allocated time for bug fixing from 20% to 60%. Features take longer, deals suffer, revenue begins to slow.

One of the architects stands up and points out that the infrastructure has been ignored. The engineering teams rally – it’s true after all! You hire dozens of new engineers, fork off a side project, and re-create your product from scratch. But it takes forever because of feature parity issues. Even now, you refuse to end-of-life anything.

Seven years later, you have a new product that looks an awful lot like your old product. But you have now sunk in nine figures ($100M+) to end up back where you began.

And who pays for these costs? The company still has to make money. So who really pays? Incrementalism comes at a cost, even if we don’t see it outright. Doing something about it starts with dropping some of the baggage that we have been carrying with us for decades and looking first at the goals we are trying to achieve.

4 thoughts on “The insidious cost of incrementalism — Part 1”

  1. Fred says:

     Well said. This is the “incumbent’s dilemma” commonly called the innovator’s dilemma. The trick is to find a use case that does not need backwards compatibility, but does need radically lower cost or higher capacity or ??. Big Data, and the big data center, seems to fit the bill.

    • mike.bushong says:

      @Fred

      Absolutely correct. If you liked Innovator’s Dilemma (or Innovator’s Solution), you should check out The Strategy Paradox by Michael Raynor. He co-wrote Solution with Christensen, and of the three, I think it had the most profound impact on how I view the world. Well worth the read, even if he does enjoy showing off his enormous vocabulary unnecessarily.

        • Really great post – you really nailed the problems the industry is facing. And I really like how Fred called it the incumbent’s dilemma. One of the things that has been really fascinating to me wasn’t watching how Cisco responded to SDN, but rather how their tiny competitors did – they wanted to beat Cisco, but when the opportunity for disruption came, they all had established businesses with established modes of operation and had become intolerant of risk. This is very important for the understanding of capitalism and market equilibrium – it seems we’ve always just assumed that having some level of competition present was sufficient to ensure innovation, when clearly it is not. Entire ecosystems settle; everyone finds their place and settles. Even if one player wants to change, the entire ecosystem must change, which isn’t as simple as assuming the mere presence of competition creates innovation and prevents stagnation. This is a case study for the entire global economy – dozens of other industries are in the same predicament, and we have to figure out a way to plow through.

        • mike.bushong says:

          As the title suggests, I will post up a Part 2 soon (was taken out by Norovirus for a few days so I am behind). The interesting dynamic in networking (and other industries for sure, but definitely not everywhere) is that customers are complicit in this way of thinking. The combination of vendor and customer obedience to incrementalism is astonishingly potent, and I believe it is a huge reason that networking is still living in the 90s while everything else has moved forward by leaps and bounds.
