
Disaggregation and the Tyranny of Or

by Trusted Contributor on 05-10-2017 12:36 PM

Disaggregation is a common topic in networking, and the subject of white box comes up especially when people discuss cloud economics. The general sentiment is that by separating the hardware and software, the combined price will drop.
But the principles of disaggregation are not limited to just hardware and software. There are lots of components that make up the technology stack. And understanding the dynamics driving some of those components will allow people to plan for how this will unfold across the networking industry at large.
Separation by itself isn’t enough
We need to clarify a few misconceptions around disaggregation. First, merely separating components is not going to, by itself, alter the economics of networking. A pure packaging exercise, in isolation, could lead to a change in pricing mix across a solution, but there is nothing fundamental about packaging that necessarily drives pricing down.
In the simplest possible terms, separation requires vendors to attribute value to multiple parts where there was previously a single product. This is a pricing mix exercise. With no other dynamic at play, there just isn’t a reason to drop the price merely because something is separated and ordered individually.
Where does pricing leverage come from?
The number one industry dynamic that impacts pricing is competition. When there are multiple solutions competing for customer dollars, it creates pressure. That pressure can be alleviated by either driving more functionality or by driving down price. Either is a fine strategic decision to make (depending, of course, on the state of the market). 
Deconstructing expansive, tightly integrated stacks allows technology vendors to pick a smaller part of the solution and offer value there. In fact, extensive technology stacks, while solving for integration, come at an expense: their very nature limits competition. This is why virtually every dominant incumbent in any technology space has a strategy to own its space and then broaden its stack by integrating into adjacent markets.
The role of customers
The challenge is that customers are dual-minded here: they want solutions that are easier to deploy and operate, and they want pricing relief. The former favors broad solutions with integrated technology. The latter favors disaggregated components where competition is feverish. It’s difficult to know which is the better long-term play.
That said, there are certainly scenarios where customers can avoid the Tyranny of Or (choosing either function or price). And somewhat surprisingly, general competitive dynamics will dictate what that looks like.
Dominant incumbent behavior
If a company enjoys a dominant incumbent position (greater than 50% market share), the strategic gameplay is pretty straightforward. New competitors will attack in two ways: they either hit the low end of the market and compete on price, or they use some technical innovation to drive a superior capability higher in the market.
As a dominant incumbent, the strategy is simple. First, use your higher product volumes to create a perpetual pricing advantage. So long as you use similar components, you will create a permanent margin advantage, which means you can lower your price more than any of your competitors, provided you are willing to stomach the result. 
Second, if any new technology emerges, you use your war chest to buy it out. The new technology will naturally represent a threat to your existing business, so your business case will always be stronger than a standalone company, which is why dominant incumbents tend to pay so much for hot new tech. 
Chilling effect
This strategy has a naturally chilling effect on competition. It makes it more difficult to get breakout technology that can replace an existing stack. If the goal is to protect existing business, almost every purchase ends up being an integration with the rest of the ecosystem to bring in the competition-damping power of “the whole stack.” This means that new technology will almost never live up to the initial promise. By definition, if it did, it would represent a threat to the broader business.
This is why disaggregation is so important. But it’s not just separation for the sake of separation. The important part here is that to neutralize the long-term competitive dynamics, two things have to be true: the stack has to be disaggregated AND components within a layer have to be interchangeable. If they are disaggregated but not interchangeable, competition doesn’t meaningfully change, and the pricing dynamics never play out. 
Why open matters
And this brings me to why open really matters. Open for the sake of open is interesting but not meaningful. An “open by default” strategy suffers the same issues that any other “by default” strategy does: it lacks nuance. If everything is a nail, then a hammer-only approach is optimal. Of course, we all know IT isn’t all nails. (I know I have seen my fair share of things getting screwed.)
Open is the way the industry promotes interchangeability between layers in a disaggregated stack. Being disaggregated requires interfaces, but it’s not until you make interchangeability a requirement that you add the additional constraint of open. In the absence of open interfaces, you still have a fragile stack where only one option works with another. If a single vendor owns both of those options, the altered economics never materialize.
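The disaggregated-but-interchangeable distinction has a familiar software analogy: components are only swappable when they all conform to a common, openly documented interface rather than to one vendor's concrete implementation. The sketch below illustrates the idea with hypothetical names (`SwitchOS`, `VendorAOS`, `provision`) invented purely for this example; it is not from the original post.

```python
from abc import ABC, abstractmethod


class SwitchOS(ABC):
    """An open, documented interface layer. Any vendor's network OS
    can implement it, making the OS layer interchangeable."""

    @abstractmethod
    def configure_vlan(self, vlan_id: int) -> str:
        ...


class VendorAOS(SwitchOS):
    def configure_vlan(self, vlan_id: int) -> str:
        return f"vendor-a: vlan {vlan_id} configured"


class VendorBOS(SwitchOS):
    def configure_vlan(self, vlan_id: int) -> str:
        return f"vendor-b: vlan {vlan_id} configured"


def provision(os_layer: SwitchOS, vlan_id: int) -> str:
    # Operator tooling depends only on the open interface, not on a
    # specific vendor, so either implementation can slot in unchanged.
    return os_layer.configure_vlan(vlan_id)
```

Merely splitting a monolith into two modules that only work with each other (disaggregation without an open interface) would leave `provision` coupled to a single concrete class, and the competitive pressure the post describes would never materialize.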
It’s worth noting here that “open” can mean a lot of things. While I won’t use this post to dive into all the meanings of open, I will say that it minimally means “open access” (as with a well-documented API layer), and in some cases means “open standard” (where a standard exists, or might exist within a reasonable amount of time). 
The bottom line
The moves described here are tied to how companies compete, not how technology is developed. And while there are external forces that will guide or constrain moves to some extent, most strategies are fairly predictable. Once you know what the outcome will be, the question is really: how do I change the outcome?
The Goldilocks scenario here is likely a relatively small number of solution providers with the heft to handle integration in a disaggregated world. This gives people the immediate deployability and convenience of an integrated solution. Who will those integrators be? That is yet to be determined. But I would think this creates a ton of opportunity to reimagine the channel and the role of resellers and integrators.
And, interestingly, it will likely create opportunity for technology vendors with the will and skill to go broad with their solutions and partners. 

About the Author
  • Mike is currently acting as a Senior Director of Strategic Marketing at Juniper Networks. Mike spent 12 years at Juniper in a previous tour of duty, running product management, strategy, and marketing for Junos Software. In that role, he was responsible for driving Juniper's automation ambitions and incubating efforts across emerging technology spaces (notably SDN, NFV, virtualization, portable network OS, and DevOps). After the first Juniper stint, Mike joined datacenter switching startup Plexxi as the head of marketing. In that role, he was named a top social media personality for SDN. Most recently, Mike was responsible for Brocade's datacenter business as VP of Datacenter Routing and Switching, and then Brocade's software business as VP of Product Management, Software Networking.