By Cord Wiggins / April 1, 2019
Is the Cloud Really More Efficient?
IT departments are moving a growing number of applications and workloads from traditional data centers to the cloud, for several reasons. First, the cloud offers faster and less expensive provisioning and implementation. Second, there's flexibility, mobility and scalability, which the cloud has baked in. Finally, there is the issue of efficiency.
IT departments traditionally, and smartly, built their IT infrastructures to handle peak demand. They may have even built them to handle more than that, both to reduce downtime and to future-proof their infrastructure against growing demand. But this over-provisioning wasn't efficient, and IT departments found themselves paying for and building infrastructures that were only partially utilized.
Although the numbers vary depending on which research you cite or whom you ask, I'd estimate that average data center utilization across enterprises sits somewhere between 20 and 30 percent, which amounts to significant waste and money effectively flushed down the drain.
The cloud is supposed to fix this. The elasticity of cloud offerings should effectively keep IT departments from having to over-provision their infrastructures. Instead, their cloud implementations can be ramped up and down – when necessary – to handle new workloads, provide development and test environments, and handle peaks and valleys in traffic or network utilization.
But a Gartner report paints a different picture. Gartner's research points to a situation that may be even less efficient. According to the report, "traditional data center issues of low infrastructure asset utilization and over-provisioning of capacity resources have spilled over into the private cloud environment. As reflected by Gartner client inquiries, many have asset utilization of just 10% to 20%, suggesting significant room for improvement in cost and usage…"