Cloud Computing vs. In-House IT: The Cost Battle

New research shows cloud computing is generally less expensive for dynamic workloads.

It's one of the fundamental questions of cloud computing: Is it less expensive to run workloads in a public cloud than in an on-premises IT environment?

The answer is “a definitive maybe.”

The answer varies depending on individual customers and use cases, but there are some general rules of thumb. For most, it comes down to a basic question: How good are you at running your in-house environment?

Organizations with highly optimized IT shops tailored to their business's needs may find cloud computing to be more expensive. But if a company has workloads that ebb and flow in their use of compute power, then the cloud can probably yield substantial savings.

Cloud IT infrastructure offerings can yield significant cost reductions for various types and sizes of workloads. But the reality of such savings is highly dependent on the type of workload, its suitability to the cloud, and the level of efficiency and optimization of in-house IT resources and operations.

The takeaway: Do your homework to find out what this means for your specific use case.

Most major research firms agree that having an optimized, in-house IT department is better for static operations and the cloud is a better deal for dynamic workloads.

In a report last year, Forrester Research said the pay-per-use model of public cloud computing frees customers from over-provisioning on-premises resources to handle peak demand in their own IT environments. According to Forrester, when your application load varies, the cloud will likely be a clear cost winner over reserving capacity in advance. The Forrester report also has an important caveat: cheaper isn't always better. It states that growing concerns around data security, performance and reliability have left some organizations simply more comfortable using their own in-house IT.
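The over-provisioning argument is ultimately arithmetic. The sketch below compares peak-provisioned on-premises capacity against pay-per-use cloud pricing for a variable daily load; all prices and demand figures are hypothetical, chosen only to illustrate the calculation.

```python
# Illustrative comparison: peak-provisioned on-premises capacity vs.
# pay-per-use cloud pricing for a variable workload.
# All prices and load figures are hypothetical.

HOURS_PER_MONTH = 730

# 24-hour demand cycle: servers needed each hour (18 quiet, 6 peak hours).
demand = [2] * 18 + [10] * 6

onprem_cost_per_server_hour = 0.20  # hypothetical amortized in-house cost
cloud_cost_per_server_hour = 0.30   # hypothetical on-demand cloud price

# On-premises: must provision for peak demand around the clock.
peak = max(demand)
onprem_monthly = peak * onprem_cost_per_server_hour * HOURS_PER_MONTH

# Cloud: pay only for the servers actually running each hour.
daily_server_hours = sum(demand)
cloud_monthly = daily_server_hours * cloud_cost_per_server_hour * (HOURS_PER_MONTH / 24)

print(f"On-premises (peak-provisioned): ${onprem_monthly:,.2f}/month")
print(f"Cloud (pay-per-use):            ${cloud_monthly:,.2f}/month")
```

With these made-up numbers, the cloud comes out cheaper even though its per-server-hour price is 50 percent higher, because capacity is only paid for during the six peak hours. Flatten the demand curve and the comparison flips, which is exactly the static-versus-dynamic distinction the research firms draw.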

There are a couple of important factors IT decision-makers should consider to help determine whether cloud computing is the more cost-effective option. While most public cloud-computing resources are pay-per-use, many also come in pre-configured virtual machine sizes. Amazon Web Services, for example, offers dozens of virtual machine sizes that can be spun up and down by the hour. But even with a broad choice of virtual machine sizes, there may not be one that is optimized for the workload being run in the cloud.
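The mismatch between fixed instance sizes and actual demand is easy to quantify: you round up to the next available size and pay for the headroom. The sketch below uses hypothetical instance sizes and prices (not real AWS offerings) to show the wasted-capacity calculation.

```python
# Illustrative right-sizing: a workload rarely matches a provider's fixed
# instance sizes exactly, so you round up and pay for unused headroom.
# Instance sizes and prices below are hypothetical.

instance_sizes = [  # (vCPUs, hypothetical $/hour), smallest first
    (2, 0.10),
    (4, 0.20),
    (8, 0.40),
    (16, 0.80),
]

def smallest_fit(vcpus_needed):
    """Return the cheapest instance offering at least the required vCPUs."""
    for size, price in instance_sizes:
        if size >= vcpus_needed:
            return size, price
    return instance_sizes[-1]  # fall back to the largest available

needed = 5  # workload needs 5 vCPUs
size, price = smallest_fit(needed)
waste = (size - needed) / size
print(f"Need {needed} vCPUs -> {size}-vCPU instance at ${price}/hr "
      f"({waste:.0%} of paid capacity unused)")
```

Here a 5-vCPU workload lands on an 8-vCPU instance, leaving roughly a third of the paid capacity idle. This is the gap that the right-sizing tools and customizable instance sizes mentioned below are meant to close.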

To address this issue, an expanding list of third-party tools has emerged to help customers optimize their clouds and ensure they are the right size for their applications. Providers have also begun offering customizable virtual machine sizes.

Hidden costs also become a factor. The cost of a public cloud service is not just the per-hour virtual machine price or the per-gigabyte storage cost. Data egress and network bandwidth expenses are "gotcha" costs that many customers overlook. In Amazon Web Services' cloud, for example, it's free to upload data, but the company charges to take data out, along with the network bandwidth that transfer consumes. Basically, cloud providers want to make their services "sticky."
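A simple bill breakdown makes the point: egress can rival or exceed the storage line item. The rates below are hypothetical tier prices for illustration, not actual AWS pricing.

```python
# Illustrative monthly cloud bill showing how data transfer out (egress)
# adds to the obvious compute and storage charges.
# All rates and volumes are hypothetical.

compute_cost = 500.00                     # monthly VM spend
storage_gb, storage_rate = 2000, 0.023    # $/GB-month
egress_gb, egress_rate = 5000, 0.09       # $/GB transferred out
ingress_cost = 0.0                        # uploads are typically free

storage_cost = storage_gb * storage_rate
egress_cost = egress_gb * egress_rate
total = compute_cost + storage_cost + egress_cost + ingress_cost

print(f"Compute: ${compute_cost:.2f}")
print(f"Storage: ${storage_cost:.2f}")
print(f"Egress:  ${egress_cost:.2f}  <- the often-overlooked line item")
print(f"Total:   ${total:.2f}")
```

In this made-up scenario the egress charge is nearly half the compute bill and roughly ten times the storage charge, which is why workloads that move large volumes of data out of the cloud deserve extra scrutiny before migration.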

However, a cloud is a good solution for many workloads. Having a cloud, whether a public one or a private cloud behind your firewall, will usually yield savings for a company whose business model requires on-demand, elastic resources.

The bigger question, which requires more analysis, is whether that cloud should be built internally or run in a public cloud.

The overall point is that the cloud is essentially another tool in the IT administrator's toolbox; it's good for some use cases, but not ideal for others. The key is knowing when to use it for what, and choosing the right platform and provider. So, don’t expect it to replace traditional IT any time soon.
