29/05/2018

Are Your Cloud Bills Higher than Expected? That’s Probably Your Fault

The cloud and other modern computing models are great, but they are not the solution to all problems. A rational approach to modern IT challenges and technologies would aim to get the best from different solutions while reducing costs and improving the quality of services provided by the IT department.

I recently read this article about the unexpected and rising costs of public cloud deployments. I commented on it with a tweet and noticed similar reactions from friends and other IT pros, but I wanted to explore the question in more detail. I am neither for nor against the public cloud; this is more about the strategy and common sense needed to make things work as intended.

The cloud is just someone else's computer

The public cloud is a tool, like a hammer, and the common saying about hammers and nails applies to it: it is not very productive to look at every problem as if it were a nail just because the only tool you have is a hammer. You won’t solve your organizational issues by changing your compute model either. If you were spending money inefficiently before, because of a lack of proper strategy, planning, monitoring, or anything else, chances are that changing the tool will bring only a short-term advantage, and all your issues will resurface, even more forcefully, somewhere down the road.

The cloud is seen as an easy solution in many large organizations, especially at the beginning. It allows businesses to quickly move costs from CAPEX to OPEX and get rid of "unnecessary" datacenter operators to cut costs. For some, it was a smarter form of outsourcing, with more control over resources and improved flexibility. In recent years, it was not uncommon to hear CIOs or lower-level executives talk about massive migrations to the cloud and all the benefits associated with this change, only to eventually discover that they were wrong.

While the cloud is perfect for a large number of workloads and applications, some applications are just too expensive to migrate. And edge computing is quickly growing in popularity because it is much more convenient to process data where it is created than to send everything to the cloud and back.

With each new technology wave, we have faced people who tried to sell the new approach as the definitive one, as though everything were a greenfield or there were no constraints. Or they simply implied that we were doing it wrong and it was time to accept that there was a better way. We have seen this many times, from mainframe to client-server, to virtualization, then microservices, down to the latest and greatest, serverless (and we patiently await the next one).

The same is true not just for the computing model, but also for the way it is implemented and consumed (here I mean private vs. public cloud). When it comes to enterprise IT, every time someone has tried to standardize their entire infrastructure on a single underlying technology or computing model, the project has crashed against harsh reality and rapidly rising costs.

Learn and understand first

The most important thing to do when faced with a new technology or computing model is to learn and understand its advantages and limits before adopting it. Going too far with a new technology, and underestimating its limits, can cause considerable damage. For example, containers are really cool, but building a persistent database or storage system on top of them is still difficult and problematic. You can do it, and you can make it work, but what price are you going to pay? Likewise, you can force the public cloud to work the way you want, but is that really how it should work? How much does it cost? Is it worth it? Is your application designed to take full advantage of the cloud while overcoming its limitations? If not, you are going to pay a lot more than you would for a traditional deployment.

Pay-as-you-go or Pay-it-less?

Sometimes the pay-as-you-go model doesn't work as linearly as expected. It is very nice at the beginning, since you don’t need an initial investment, but all the flexibility provided by the public cloud has a cost, and many organizations, seeing this, have taken a step back and started building private or hybrid infrastructures again.

Last year, one of our customers migrated from AWS S3+EC2 to a hybrid cloud built on dedicated servers running OpenIO SDS, and they saved a boatload of money, with even more savings planned for the coming years (here is the full story). The total cost of this type of solution is much more predictable, and even if you can't scale the infrastructure by the minute, it is flexible enough to keep full control and to scale quickly when needed.
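To make the predictability argument concrete, here is a minimal back-of-the-envelope sketch in Python. Every price and volume in it is a hypothetical placeholder, not the customer's actual figures and not any provider's real rate card; the point is only the shape of the two bills: one grows with every terabyte stored and, above all, with egress, while the other stays flat until you add a server.

```python
# Back-of-the-envelope comparison: usage-based billing vs. flat dedicated servers.
# Every number below is a hypothetical placeholder, not a real price list.

def public_cloud_monthly(tb_stored, tb_egress,
                         price_per_tb_stored=23.0, price_per_tb_egress=90.0):
    """Usage-based bill: grows with storage and, above all, with egress."""
    return tb_stored * price_per_tb_stored + tb_egress * price_per_tb_egress

def dedicated_monthly(tb_stored, tb_per_server=100, cost_per_server=450.0):
    """Flat bill: a fixed cost per server, stepping up only when capacity is added."""
    servers = -(-tb_stored // tb_per_server)  # ceiling division
    return servers * cost_per_server

for tb in (50, 200, 500, 1000):
    egress = tb * 0.3  # assume 30% of the data set leaves the platform each month
    print(f"{tb:>5} TB   usage-based ≈ {public_cloud_monthly(tb, egress):>9,.0f} €"
          f"   dedicated ≈ {dedicated_monthly(tb):>7,.0f} €")
```

The exact numbers matter far less than the structure: the usage-based bill has several variables that move every month (egress above all), while the flat bill has essentially one, and that is what makes it easy to forecast.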

Takeaways

The cost of the public cloud depends on how you use it. For example, you could buy reserved instances, but what's the point of using the public cloud if you keep resources allocated when you are not using them? Maybe your application isn't ready to be ported to the cloud.
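As a rough illustration of that point, here is a hedged sketch of the reserved-instance trade-off. The hourly rates and discount are invented placeholders, not real pricing; the logic is simply that a reservation is billed whether or not the instance is busy, so it only pays off above a certain utilization. Below that, you are paying a fixed bill for idle capacity, which is exactly what the cloud was supposed to avoid.

```python
# When does a reserved instance beat on-demand? Placeholder rates, not a real price list.
on_demand_hourly = 0.10    # hypothetical on-demand rate
reserved_hourly  = 0.065   # hypothetical effective rate with a 1-year commitment
hours_per_month  = 730

# Reserved is billed whether or not you use it; on-demand only while running.
break_even_utilization = reserved_hourly / on_demand_hourly  # 65% here

for utilization in (0.30, 0.65, 0.90):
    on_demand = on_demand_hourly * hours_per_month * utilization
    reserved  = reserved_hourly * hours_per_month
    cheaper = "reserved" if reserved < on_demand else "on-demand"
    print(f"utilization {utilization:.0%}: on-demand {on_demand:6.2f}, "
          f"reserved {reserved:6.2f} -> {cheaper}")

print(f"Reserved pays off only above ~{break_even_utilization:.0%} utilization.")
```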

Every time a new computing model is introduced, it looks like the best we have ever seen. Unfortunately, they all come with significant drawbacks, especially at the beginning, when they are immature. And it is so easy to allocate resources in the public cloud that your bill can quickly go through the roof.

The latest example comes from serverless. I love serverless, but it is not meant to replace everything (though some pretend otherwise). I'm starting to hear more and more stories about stale functions, or about how complicated it is to manage thousands of functions instead of hundreds of containers, which introduces unforeseen costs and complexity.

I don’t want to suggest that "cloud first" or "serverless first" or "whatever first" strategies are wrong, but maybe "common sense first" is better. Now that the cloud euphoria has ended, we must take the cloud for what it is: another silo in our infrastructure. It could easily become the primary one, but if it is not used in the right way it can also become the most expensive, as many companies have discovered.

A balanced approach gives the best ROI and reduces risk. But, even more important (and more difficult), investing in the right culture within a company’s IT organization can make the difference in building your IT strategy and using technology effectively.

Learn how Teezily saves 400K €/Year on storage infrastructure with OpenIO SDS