Ask a few IT pros about the cloud, and chances are you won’t get the same answer from any two of them. Newer admins might cite public clouds as the way forward, while the old guard might grump about the infiltration of consumerization – tablets and smartphones, especially – into their workplace. But two common worries emerge: that cloud computing costs more than providers claim, and that broad adoption will steal IT jobs.
There are two insidious cost myths about cloud computing, and they’re opposites: that the cloud costs far less than traditional IT infrastructure, and that it costs far more. Neither is entirely accurate, but both are used as reasons to put off cloud adoption or shore up local services. According to a recent Dataprise infographic, however, the truth lies somewhere in the middle.
The disconnect for many companies and their number-crunchers lies in the divide between purchase and ownership of physical systems, known as capital expenses, and the costs to maintain these systems, or access subscription-based options, known as operating expenses. Traditional IT wisdom has companies buying every bit of technology infrastructure they need, which results in large, irregular capital expenses as servers require upgrades or replacement. Cloud computing, meanwhile, requires virtually no capital expenses, which often leads to the misconception that it is a cheaper alternative.
This isn’t entirely accurate. In many cases, the operating expense of a cloud service can be equal to, or greater than, the capital expense of a server. The value of cloud computing, however, lies in cost over time. Without the need to maintain and upgrade servers, and with the ability to scale infrastructure use as needed, cloud costs decrease sharply after the initial setup, making them at least on par with in-house solutions, and in many cases less expensive.
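The capex-versus-opex trade-off above can be sketched as a simple cumulative-cost comparison. All figures, and the hardware refresh cycle, are hypothetical assumptions chosen for illustration, not real vendor pricing:

```python
# Hypothetical cost model: every figure below is an illustrative
# assumption, not actual pricing from any provider.
SERVER_CAPEX = 20_000         # up-front purchase of an in-house server
SERVER_REFRESH_YEARS = 4      # assumed hardware replacement cycle
SERVER_OPEX_PER_YEAR = 3_000  # power, cooling, maintenance
CLOUD_OPEX_PER_YEAR = 7_000   # subscription fees, no capital outlay

def in_house_cost(years):
    """Cumulative cost of buying, maintaining, and replacing servers."""
    # Initial purchase, plus a replacement every SERVER_REFRESH_YEARS.
    refreshes = 1 + (years - 1) // SERVER_REFRESH_YEARS
    return refreshes * SERVER_CAPEX + years * SERVER_OPEX_PER_YEAR

def cloud_cost(years):
    """Cumulative cost of a subscription-based cloud service."""
    return years * CLOUD_OPEX_PER_YEAR

for y in (1, 4, 8):
    print(f"Year {y}: in-house ${in_house_cost(y):,}, cloud ${cloud_cost(y):,}")
```

Under these made-up numbers the cloud looks expensive year over year, yet the in-house path incurs large, irregular capital hits at each refresh; which option wins depends entirely on the assumed figures, which is exactly why neither cost myth holds universally.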
The Real Takeaway
IT professionals also worry about their job security in a cloud-based world. The myth here goes like this: when someone else manages a company’s data, there’s no need for in-house IT. But that’s a narrow view. A recent CITE World article makes the point clear: despite the increasing consumerization of IT, and the trend of business departments circumventing administrator rules to get their work done, IT professionals aren’t being phased out. Their focus simply needs to change.
Instead of focusing on traditional tech problem solving and data management, IT departments are increasingly tasked with analyzing big data trends, programming in-house apps and educating staff on technology benefits. These complex, high-level tasks are best kept in-house, and represent what most IT professionals would prefer to do with their time instead of constantly chasing down data problems. Simply put, the cloud doesn’t steal IT jobs; it evolves their purpose.
There’s no question that the cloud comes with new, up-front costs, and can take away some of the “common” tasks often associated with IT admins. Lowered operating expenses over time, however, combined with a focus on planning technology futures instead of troubleshooting the present, help poke holes in both the job-loss and cost-based cloud myths.
Doug Bonderud is a freelance writer, cloud proponent, business technology analyst and a contributor on the Dataprise website, a provider of Maryland cloud services.