These days, cloud is one of the most overused terms in the computing industry.
Many products and services that have existed for years have been renamed or refreshed to incorporate the word cloud, to benefit from the cachet, or the increased stock market value, that the term brings.
Most IT professionals will recognise the National Institute of Standards and Technology (NIST) definition of cloud computing, with its three service models (Platform-as-a-Service, Infrastructure-as-a-Service and Software-as-a-Service), four deployment models and five essential characteristics.
Both these IT professionals and their business colleagues will have been inundated with marketing messages, case studies and proof points reinforcing the idea that cloud computing will increase availability, reduce costs and improve the quality of IT.
Hence, surveys that measure how widely cloud computing is recognised, rather than where, when and how much it is used, lead IT organisations to believe that cloud computing is deeply understood and that no further training or education on the subject is needed.
This superficial understanding, based on skewed survey results, may be the single biggest barrier to more businesses benefiting sooner from what cloud computing can ultimately deliver.
IT professionals no longer need to be told what cloud computing is and what its benefits are; instead, they need to understand what they must actually do to use cloud computing and truly benefit from it.
Organisations that have used Infrastructure-as-a-Service (IaaS) will already know that moving an existing application designed for a physical or virtualised stack will not make it cloud native.
Application availability may improve, costs may fall and the quality of IT may get better, but none of these outcomes is guaranteed unless the application is written specifically to take advantage of a cloud platform's design and operations.
An application that has previously failed to meet user needs, suffered from poor availability or had performance issues will not become a well-behaved application simply because it is now deployed in the cloud.
Most importantly, unless organisations understand this, we will inevitably see more examples where organisations with mis-set expectations falsely conclude that cloud computing fails to deliver business value.
Only when applications are cloud native can organisations be confident they will fully realise the benefits of the cloud.
Thus, educating IT professionals and developers on how a cloud native application is architected needs to be prioritised.
Otherwise, new applications written using traditional techniques will tie organisations to yet more legacy code and legacy applications for many years to come.
A cloud native application is designed around microservices and assumes there will be component failure.
It will be loosely coupled and asynchronous; it will use NoSQL databases that scale out, not up, and it will be built to run active/active across zones or data centres.
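One practical expression of "assuming there will be component failure" is that every call to another service is written to tolerate transient errors. The sketch below is illustrative only (the function and service names are assumptions, not from any particular platform): a simple retry with exponential backoff and jitter, the kind of defensive pattern a cloud native component would wrap around calls to its dependencies.

```python
import random
import time


def call_with_retry(operation, attempts=3, base_delay=0.1):
    """Call an unreliable component, retrying with exponential backoff.

    `operation` is any zero-argument callable. Names and parameters
    here are illustrative assumptions, not a real platform's API.
    """
    for attempt in range(attempts):
        try:
            return operation()
        except ConnectionError:
            if attempt == attempts - 1:
                raise  # all attempts exhausted; surface the failure
            # Back off before retrying, with jitter so many callers
            # do not all retry at the same instant.
            time.sleep(base_delay * (2 ** attempt) * (1 + random.random()))


# Example: a flaky dependency that fails on its first call, then succeeds.
calls = {"n": 0}

def flaky_lookup():
    calls["n"] += 1
    if calls["n"] < 2:
        raise ConnectionError("replica unavailable")
    return "ok"

print(call_with_retry(flaky_lookup))  # "ok" after one retry
```

An application built from components that each degrade gracefully in this way keeps serving users even while individual instances or zones are failing underneath it.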
Designing a cloud native application in this way allows it to remain available on a cloud platform even when individual components fail.
In addition, the fact that it can dynamically scale will reduce costs, because infrastructure can be turned on and off in line with peaks and troughs in demand, so capacity that is not needed no longer has to be paid for.
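The cost mechanism described above comes down to a simple sizing decision made continuously: run only as many instances as current demand requires, within sensible bounds. A minimal sketch, with purely illustrative numbers and parameter names:

```python
import math


def desired_instances(current_load, capacity_per_instance,
                      min_instances=1, max_instances=20):
    """Return how many instances to run for the current demand.

    All thresholds and units (requests per second) are illustrative
    assumptions, not figures from any provider.
    """
    needed = math.ceil(current_load / capacity_per_instance)
    # Never scale below the floor or above the ceiling.
    return max(min_instances, min(max_instances, needed))


# Off-peak: 30 req/s against 100 req/s per instance -> minimum footprint.
print(desired_instances(30, 100))    # 1
# Peak: 1,500 req/s -> scale out to meet demand.
print(desired_instances(1500, 100))  # 15
```

Because the off-peak footprint is a fraction of the peak one, the bill tracks demand rather than worst-case capacity, which is where the savings over a statically provisioned stack come from.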
Cloud native applications will also intrinsically offer disaster recovery, and can be deployed across multiple cloud providers – giving organisations greater choice and reducing supplier risk.
However, application developers also need to be careful about writing their code for a specific cloud platform or cloud provider.
For instance, using Amazon Web Services (AWS) or Microsoft Azure features to create cloud native applications inevitably creates lock-in to that proprietary platform, as feature sets are not common across providers.
This means that running an application across different cloud providers becomes far harder and more expensive.
This is why many developers first write and run the application on a non-proprietary cloud platform, to ensure they do not use the lock-in features that are so easily incorporated when developing on Amazon or Microsoft.
The bottom line is that there are countless examples and proof points showing that cloud computing increases availability, reduces costs and improves the quality of IT.
For every example of a successful cloud application there is an example of a legacy application that fails to leverage the characteristics of the cloud.
Hence, to truly benefit from the cloud, organisations must write applications to be cloud native, and must understand what that means for their developers, their methodologies and their approach.
Otherwise, it is easy to propagate yet more legacy technology that benefits only marginally from the cloud, and to leave users with mis-set expectations about why the full benefits were not realised.
Simon Hansford is chief executive of Skyscape Cloud Services