/var/blog
The greatest trick the public cloud ever pulled
05.29.17

There is a memorable line from the movie The Usual Suspects. Kevin Spacey’s character Verbal Kint says “The greatest trick the Devil ever pulled was convincing the world he didn’t exist.” 
 
In IT, there is a near equivalent. As the world accelerates towards cloud everything, it seems a near certainty that most enterprise workloads will move to the cloud in some form or another. As they do, we will collectively look back on these formative years with awe and wonder at how the major cloud providers managed to convince the world they are “public”.
 
Public as a synonym for “open”
It seems obvious that the modifier public is intended to serve as a synonym for open. The idea that the cloud is public means that workloads that migrate there will not fall under the proprietary control that has plagued much of enterprise IT. Rather, the cloud will serve as an open space, an abstraction of sorts from the underlying systems and software that have conspired to make enterprise IT difficult to evolve.
 
That this labeling is attractive says something about the state of most enterprise IT. It is a fairly uncontroversial truth that IT has been dogged by proprietary systems, designed out of necessity to solve once-unsolvable problems. But as technology has advanced, what was once unsolvable is frequently not even difficult anymore, provided you start from scratch.
 
I would argue that the hardest IT problems these days are not dealing with the new but rather the old. Legacy applications supported by legacy equipment with legacy features designed to deal with legacy organizations. In the networking world especially, if you told your resident network engineer she didn’t have to worry about any application more than 5 years old, she would probably grin and make plans for how she was going to use the copious free time that was suddenly available.
 
But public isn’t open
Of course, we sort of know by now that the public cloud isn’t really open, at least not in the sense that open means easily interchangeable. This has two profound implications for cloud planning. First, once you pick a provider, it is non-trivial to change. Second, it complicates life for anyone who wants to deploy a multi-cloud environment.
 
On the issue of change, this should give rise to a cloud migration practice. The reseller world is already being shaped by VARs that are pushing cloud rather than traditional equipment. It turns out that if everyone is wondering how to make the move to cloud, companies that position themselves as skilled in managing the transition can provide real value and make a ton of money in the process.
 
And on the issue of multi-cloud, there are probably two truths here. First, with cloud still nascent for most enterprises, the idea that cloud providers should be treated like any other supplier is still a ways off. When companies do start managing cloud providers the way they manage other suppliers, dual-vendor strategies will emerge, and that will make cloud arbitrage a core requirement. Second, the difficulty of migrating between clouds means that the initial instance of this will be more about distributing workloads across vendors than about running the same workloads in multiple places.
 
Cloud as a commodity
The cloud providers have behaved very cleverly thus far. If everything moves to commodity (or utility) over time, then during these formative years it is in the cloud service providers’ best interests to create hurdles to migration. Once they attract users, they want to make it difficult for those users to leave for better pricing options.
 
We saw this play out with mobile operators. When mobile coverage was largely the same across providers, the thing that kept most subscribers in place was the desire to hold onto their phone number. Once number portability broke that barrier, market dynamics shifted to revolve primarily around marketing and pricing.
 
In the absence of something that keeps people from moving, the same dynamic will hold true for the cloud.
 
Data as retention
Amazon figured out quite early that people would come for the resources but stay for the data. Migrate whatever workloads you want, and they will provide storage for a pittance. From within the cloud, access all the data you want. But if you want to pull that data out and move it somewhere else? That is where the meter really starts running.
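To make the asymmetry concrete, here is a back-of-the-envelope sketch. The prices are assumptions, ballpark 2017-era object storage and internet egress rates rather than quotes from any particular provider, but the shape of the math is the point: leaving costs you several months of rent, all at once.

```python
# Back-of-the-envelope sketch of the storage-vs-egress asymmetry.
# Both prices are assumed, ballpark 2017-era rates; check your
# provider's actual price list before relying on these numbers.

STORE_PER_GB_MONTH = 0.023  # object storage, USD per GB-month (assumed)
EGRESS_PER_GB = 0.09        # internet egress, USD per GB (assumed)

def monthly_storage_cost(gb: float) -> float:
    """Cost to keep `gb` of data parked in the cloud for one month."""
    return gb * STORE_PER_GB_MONTH

def one_time_egress_cost(gb: float) -> float:
    """Cost to pull `gb` of data back out over the internet, once."""
    return gb * EGRESS_PER_GB

if __name__ == "__main__":
    data_gb = 100_000  # 100 TB, a modest enterprise archive
    rent = monthly_storage_cost(data_gb)
    exit_fee = one_time_egress_cost(data_gb)
    print(f"Keep 100 TB for a month: ${rent:,.0f}")
    print(f"Move 100 TB out, once:   ${exit_fee:,.0f}")
    print(f"Leaving costs the equivalent of {exit_fee / rent:.1f} months of rent")
```

Under these assumed rates, parking 100 TB costs about $2,300 a month, while moving it out once costs about $9,000. Staying is always the path of least resistance.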
 
To be fair, Google figured this out as well. They have been playing this game with Gmail for years. Once you convince people to archive things, the barrier to migration emerges, and it only grows over time.
 
Microsoft has bet that enterprises will ultimately value the data associated with their applications (primarily Office). For digital enterprises, especially those that create a lot of content, this is a coherent strategy. Oracle will undoubtedly bet that the data stored by ERP systems is critical. Given the tens of millions of dollars and high-end consulting help required to upgrade ERP systems, they may well be right. Once companies put their data in the cloud (typically via a free offering designed to entice you), it will take a minor miracle to move it out.
 
For these two, having partners, professional services, and expansive enterprise sales teams should certainly help drive adoption.
 
Google building the black hole of data
But the most impressive strategic move, particularly given how late they are to the party, has to be Google’s. If they can position Google Cloud as the place to cultivate machine learning, they will have convinced the world not only to store bits of data there but to willingly redirect virtually all of their data to the cloud.
 
If that data is used to train models, and those models need to evolve over time, and the only place to do that kind of GPU-intensive work is Google Cloud, what kind of hold will they have?
 
The move is predicated on companies figuring out how to develop and use machine learning, a step beyond the Digital Transformation many of them are going through now. To encourage that move, Google can certainly sell or even give away its machine learning tooling. Whatever commercial value the algorithms might have on their own pales in comparison to the value of requiring everything to live in Google Cloud in order to use them.
 
And by the way, they are building their own silicon for this and offering researchers the ability to use their Tensor Processing Units (TPUs) for free.
 
The bottom line
The cloud might be public, but moving data around in public requires a mass transit system that simply doesn’t exist right now. The cloud providers have cleverly kept that market expensive. This is going to force companies to be more thoughtful than just “Move our stuff to the cloud. Stat!” Without a data strategy, it could be that companies are trading one set of tyrannical dictators for another. But hey, it’s public.
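To give that missing mass transit system a sense of scale, here is a rough sketch; the data size, link speeds, and 80% efficiency factor below are assumptions for illustration, not measurements. Moving a petabyte over a fully dedicated 1 Gbps link takes roughly four months, before the egress bill even enters the picture.

```python
# Rough transfer-time arithmetic for bulk data movement between clouds.
# Data size, link speeds, and the 80% efficiency factor are assumptions.

def transfer_days(data_tb: float, link_gbps: float, efficiency: float = 0.8) -> float:
    """Days to move `data_tb` terabytes over a `link_gbps` link.

    `efficiency` discounts protocol overhead and real-world throughput.
    """
    bits = data_tb * 1e12 * 8                       # terabytes -> bits
    seconds = bits / (link_gbps * 1e9 * efficiency)
    return seconds / 86_400                         # seconds -> days

if __name__ == "__main__":
    for gbps in (1, 10, 100):
        print(f"1 PB over {gbps:>3} Gbps: ~{transfer_days(1000, gbps):.0f} days")
```

At 1 Gbps that works out to roughly 116 days, at 10 Gbps about 12, and even a dedicated 100 Gbps link still needs more than a day. Wide, cheap pipes between clouds are exactly what the providers have no incentive to build.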

05.31.17
Juniper Employee

Excellent analysis of what is happening. It's hard and complex to modernize an infrastructure that has grown organically. "Move our stuff to the cloud" is not only fashionable but also appears simpler, cheaper, more scalable, more [anything the boss likes to hear]. Curious to see what the move will be five years down the road when legacy starts biting again.

The challenge is to enable companies that are thoughtful about their data to escape the trap and keep control of their data and data processing.
