Fractured automation market: the effect of adoption waves
May 15, 2017
Automation is certainly a top theme in networking. Virtually every networking vendor has automation as a key pillar of its stated strategy. Tools companies, typically oriented around server- and application-side automation, are seeing traction in the network engineering space. And companies that are adopting cloudy operations practices are filling up conference speaking slots globally.
But tool adoption is not really an industry-wide phenomenon. It more closely mirrors programming languages. There are different flavors, each relevant for different sets of problems. And that means the automation market will be necessarily fractured.
So how does anyone navigate a fractured market?
How vs. what
It’s tempting to think about product adoption in terms of base functionality. When people purchase or implement a particular tool, it’s about the functionality that the tool brings. Except that is a huge simplification, and it misses one of the most important underlying dynamics.
When people adopt new products—from automation tool to full-blown router—they are adopting more than a data sheet. They are adopting the workflows that are required or enabled by the product. And while the functionality of a product might be easily found in competing offerings, the workflows are almost always different.
So when a new thing comes out that is better than the old thing, it has to be so much better that an individual not only changes what they use to do the job but also how they actually do the job.
How much time does a product consume?
When product managers imagine the lives of their customer base, they tend to grossly overstate how much time an individual worker spends with their product. While a product (or class of products) makes up the entirety of a vendor’s world, it is simply one of many deployed products in a customer’s life.
This creates an interesting dynamic. Users spend time with a product, but unless their job is very narrowly defined, they spend a relatively small amount of their time with any one specific product. They are forced to split their time across whatever portfolio of products and tools make up their day-to-day.
If your primary interaction with one of these products is through the workflows required to use it, then you will naturally develop a familiarity with that workflow. And if that workflow is common but not the only thing you do, you will favor familiarity over almost anything else. When your day spans multiple products, context switching takes real effort, especially when it happens under duress (like when something breaks). So even if an alternative is slightly better, if it brings unfamiliarity with it, it just isn't that compelling.
Basically, to unseat the incumbent workflow, a new way of doing something has to be more than just slightly better.
Why commonality matters
While not the focus of this particular post, I would be remiss to not point out that this is why things like CLI (and API) commonality matter. If users can take advantage of a common interface and common workflows over a large swath of products for which they are responsible, then the amount of contextual knowledge they have to have in the moment drops substantially.
But be wary. Commonality means commonality. Things that are similar but not the same have the opposite effect. The worst possible situation is to have things that are largely the same but subtly different. In the heat of the moment, that requires users to keep full context, which requires finer-grain familiarity, and raises the risk of human error.
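To make the commonality point concrete, here is a minimal sketch of the kind of thin mapping layer that gives operators one familiar workflow even when underlying platforms differ subtly. The vendor names and command strings are hypothetical, purely for illustration:

```python
# Hypothetical sketch: hide similar-but-different syntax behind one
# common entry point, so operators keep a single workflow in their heads.
SHOW_VERSION = {
    "vendor_a": "show version",
    "vendor_b": "display version",  # subtly different syntax
}

def show_version_command(platform: str) -> str:
    # One common interface on top; per-platform detail lives below.
    return SHOW_VERSION[platform]

print(show_version_command("vendor_b"))  # display version
```

The point is not the trivial lookup; it is that without a layer like this, every subtle difference lives in the operator's head, which is exactly where errors happen in the heat of the moment.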
When a new tool comes out, the first movers are the early-adopter class. For this persona, the thrill of learning something new, solving a fresh problem, or cracking the puzzle of optimization is its own reward.
But for the average person concerned with performing their job, adoption for the sake of adoption lacks the payoff. It simply isn't worth the effort to stay up to date on all the latest things. It's not satisfying to understand the nth-level differentiation, even if it saves a whopping 20% of the time to do something. How much time would this person put into learning something new to save 20% of the time on a task that makes up only 15% of their job?
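The arithmetic makes the weak incentive obvious. A 20% speedup on a task that is only 15% of the job works out to a 3% gain overall:

```python
# Savings from a 20% faster workflow on a task that is 15% of the job
task_share = 0.15   # fraction of the job the task represents
speedup = 0.20      # fraction of the task's time the new tool saves

overall_gain = task_share * speedup
print(f"{overall_gain:.0%}")  # → 3%
```

Three percent of total working time is rarely enough to justify relearning a workflow, which is exactly why familiarity keeps winning.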
So what tends to happen is that new technologies will come forward. They will spawn a bunch of different tools. And users will pick up one of these tools. But once they select a tool, they are more or less locked in.
When first mover isn’t an advantage
This means that markets for these tools will be fractured. And interestingly, the tools that come first in these waves? They won’t fare that well.
When a technology is new, who adopts? The early adopters. But these are the ones who have the least loyalty to one specific tool. The very fact that they are early adopters means that they are the most likely to jump to the next thing, even if it is only marginally better.
Consider the class of provisioning tools that includes Chef and Ansible. In networking circles, the shorthand for those companies doesn’t even include Puppet anymore in many cases, even though Puppet integration was the first target. And CFEngine barely gets a mention despite being early.
(And before I get poked on the example, I understand the move to agent-less. This is just an example.)
Where should the focus be?
If adoption is going to move in waves, and if this is going to create a fractured market, where should network automation focus?
There are two things that must be true: integratability is as important as any single integration, and migration has to be made easier.
On the first, the typical strategy is to go deep on integration. This is useful for the people who use that one integration, but if there are going to be competing tools, it will be important to integrate with several of those. This puts a lot of emphasis on the interfaces through which integration happens. In a network-tool world, this is all about getting structured data on and off a device (or controller, if that is the point of workflow execution).
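The "structured data on and off a device" point is worth making concrete. A minimal sketch, using Python's standard `json` module and an entirely hypothetical device payload, shows why structured interfaces make a product integratable with many tools rather than integrated with one:

```python
import json

# Hypothetical structured payload a device (or controller) API might return.
payload = '{"interfaces": [{"name": "eth0", "admin_state": "up", "mtu": 9000}]}'

data = json.loads(payload)
# With structured data, any tool integrates by walking keys and values,
# not by screen-scraping CLI output with brittle, per-platform regexes.
mtus = {i["name"]: i["mtu"] for i in data["interfaces"]}
print(mtus)  # → {'eth0': 9000}
```

Any tool in any wave can consume a payload like this; a scraped CLI transcript, by contrast, couples the integration to one tool's parsing logic.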
On the second, the networking industry has an automation consumption problem, not a tools problem. There have been tools for literally decades. That these have been sparingly adopted (and typically only by the most motivated) is a reflection on the difficulty of using them, not on whether or not automation is important to users. Companies that succeed in navigating the waves of user adoption will need to focus on migrating between different operational environments. Whether this is a product or services thing is outside the scope of this blog, but it’s an interesting question.
The bottom line
All of technology moves in waves. And depending on an individual's comfort with change, they will be earlier or later in those waves. This means the tooling landscape is going to have several winners. At its most basic, gear that operates within this landscape will need to integrate with many options, which pushes the emphasis beyond any one integration to the repeatable integratability of the product.
It could be that the right axis on which to judge these products, assuming a fluid tooling landscape as enterprises cope with cloud and DevOps, is the robustness of their interfaces, along with the ability and willingness to integrate quickly, so that whatever comes in the next wave can be accommodated without sacrificing too much of the familiarity that makes network engineering manageable.