Author: Eric Jensen, VP/GM, Data Aire
Sometimes we hear people comment that we should be thankful to live in such interesting times. But can there be too much of a good thing? In the tech world, things are moving so quickly that they're keeping many of us on our toes, wondering what's next. Just look at the cloud services industry, which is expected to grow 17.5% this year, to $214.3 billion (source). This industry barely existed 10 years ago! Similarly, the data center industry is poised for interesting times of its own over the next five years. Hang on, because we have a wild ride ahead of us. I thought it might be useful to share a few insights into what is going on behind the scenes to help you plan accordingly.
Gartner & AFCOM Data Center Forecasts
Let’s first take a closer look at what the analysts are saying. According to one extreme forecast from Gartner analyst Dave Cappuccio, 80% of all enterprises will have shut down their traditional data centers by 2025, compared to just 10% in 2018 (source). If correct, this means a radical transformation in how and where data is stored, how it is managed, and what equipment is needed to keep it secure and available as it flows both within and outside of your enterprise. However accurate this forecast proves to be, one thing is certain: changes are happening in both IT and facilities.
Conversely, respondents to the latest AFCOM State of the Data Center report expect meaningful increases in ALL data center measures over the next three years. Data center growth looks to be on the upswing across the board. As the report indicates:
- The average number of data centers per organization (including remote sites, computer rooms, clean rooms, and edge) is about 12. This will increase to 13 over the next 12 months and jump to nearly 17 over the next three years.
- Respondents further indicated that, on average, more than four data centers will be built over the course of the next 12 months per organization — and nearly five more over the course of three years. And with this growth comes new requirements and demands around operational and environmental optimization.
Here are three factors that help explain why this transformation is occurring, so you can plan accordingly:
1. Cloud Computing’s Impact on Data Center Management
If we take a close look at this Gartner forecast, it soon becomes clear that cloud computing is a big driver of this transformation. Its expected growth and adoption are changing how and where data is stored – and how it must be managed. Further, organizations are digitally transforming their operations to become more agile so they can respond faster to change, which also affects how data centers must be managed. With more transactions now occurring beyond the firewall, the concept of a “closed” computer room or data center is going away. Better collaboration will be required.
2. Growing Clout of the Big 5 Service Providers
While it might have been unclear a few years ago who the market leaders would be in the field of managed cloud services, today the top 5 providers – AWS, Microsoft, IBM, Google and Alibaba – own about half the market (source). Collectively, they’ll earn about $112 billion of revenue from this segment in 2019.
There are a couple of trends that help explain why this transition has occurred:
- Workload placement in a digital infrastructure is driven primarily by business need, and so is far less constrained by physical location
- Significant cost and software maintenance advantages exist with the cloud, which is accelerating deployment
- As organizations increasingly execute on their digital transformation strategies, they need to become more scalable and agile to remain competitive; a cloud strategy helps enable this transition
3. The Data-driven Organization
Data now plays an increasingly critical role in how enterprises and other organizations are run and operated. Worldwide Big Data market revenue for software and services is projected to increase from $42B in 2018 to $103B in 2027, a compound annual growth rate (CAGR) of over 10% (source). Yet another very interesting number!
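As a quick sanity check on that projection, the implied growth rate can be computed directly from the article's own endpoints ($42B in 2018, $103B in 2027) using the standard CAGR formula:

```python
def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate as a fraction: (end/start)^(1/years) - 1."""
    return (end / start) ** (1 / years) - 1

# Big Data revenue figures cited above: $42B (2018) -> $103B (2027), 9 years.
growth = cagr(42, 103, 2027 - 2018)
print(f"{growth:.1%}")  # about 10.5% per year, consistent with "over 10%"
```

The same helper works for the cloud services figure cited earlier, or any other projection built from two endpoints and a time span.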
Part of what is driving this change is the need for higher-performance computing capabilities to run complex applications involving very large volumes of data. A new term has emerged, “data gravity,” an analogy to the way that, under the physical laws of gravity, objects with more mass attract those with less. As the data that organizations amass gets very large, they can’t practically move it, so they start hosting the applications that process it in the same location. Virtual gravity is now at work, often across several locations, each running mission-critical applications with expectations of zero downtime. Thanks to the Internet of Things and corporate digital transformation strategies, this wealth of knowledge and intelligence has become a central part of decision support – starting at the strategic, corporate level. This knowledge is very valuable, as it can be used to gain competitive advantage and deliver a better customer experience.
New Pressure on Data Center Operators
Each of these three market pressures has placed a new burden on infrastructure and data center operators, who are increasingly in the spotlight when connectivity or uptime issues occur. These operators must focus on ensuring that service partner ecosystems are in place to best support the new requirements of the cloud computing revolution.
Higher-performance applications, computations and queries demand more equipment to support greater data throughput – all generating more heat. As the big five cloud service operators continue to grow, new pressure on cost savings will encourage higher equipment density, creating an acute need for greater precision in managing temperature and climate conditions. Look for significant new burdens on data center controls (sequence of operation), thermal management, and facilities management overall.
Fortunately, improved engineering strategies and technologies for building precision temperature-controlled computing environments now exist. Just as increasing sophistication has enabled the extraction of more data for smarter decision support, so too has it become possible to engineer micro-settings within data center temperature ecosystems, ensuring that temperature and climate conditions are rigidly adhered to. Maintaining the right operating conditions will be critical across the entire data storage and processing ecosystem – the weakest link can bring down increasingly important business processes, with very high visibility should failure occur. Systems will increasingly be expected to run with near-perfect uptime, given the growing number of business-critical applications that rely on the high-value data now collected as part of every business operation.
Plan Now for Interesting Times
With budget planning either just under way or about to begin in the coming weeks, now might be a great time to take a broader look at the performance of your overall data storage and processing ecosystem. A prudent move would be to explore not only what future capabilities and higher standards can be attained internally, but also how you can expand those capabilities in partnership with your service providers to ensure no weak links exist.
With hybrid computing solutions emerging that can simultaneously take advantage of the extreme scale of the big providers while delivering localized enterprise performance and high reliability for mission-critical applications, now is the time to put together an operating plan – and plan for higher performance. However the industry evolves, investment in precision environmental monitoring and control has become a prerequisite for sustaining overall enterprise profitability.