It is clear that cloud computing was one of the key innovations that defined the 2010s. Mainstream adoption was preceded by decades of groundwork, starting with U.S. government scientist J.C.R. Licklider's vision of an "intergalactic computer network" in the 1960s. Cloud computing entered the mainstream technology arena in 2006, when AWS announced the launch of its Elastic Compute Cloud. However, it wasn't until the 2010s that the cloud truly began to lead the IT revolution, fully transforming the way computers and software operate. Leading brands adopted it, and cloud-native companies such as Amazon, Netflix and Facebook became some of the largest digital businesses in the world.

In 2010, the cloud computing market was further established as technology giants Microsoft, Google and Amazon Web Services built out their cloud divisions. The same year saw the launch of OpenStack, a leading open-source cloud software platform, cementing the rise of cloud-enabled technologies.

The definition of cloud computing has changed dramatically over the years with the rapid growth of edge and hybrid cloud adoption. Although still considered an early-stage technology, the development standard for both enterprise and consumer applications is expected to shift from cloud-enabled to cloud-native in the coming years. According to Statista, public cloud spending increased fivefold over the decade, starting at $77 billion, and is predicted to reach $411 billion by 2020.

Enterprises will complete the cloud migration

Despite the anticipation of what cloud computing can do for organisations and how it can underpin modern business infrastructure, many businesses are yet to make the leap. According to Forrester, less than half of all enterprises currently use a public cloud platform. Somewhat surprisingly, recent research by 451 Research found that the financial services industry is a leader in adopting cloud technologies. With increasing competition from cloud-native disruptors in the banking arena, 60 per cent of financial services companies surveyed reported that implementing cloud technology will be a business priority from next year.

Furthermore, a recent McKinsey survey highlighted how far many enterprises still are from completely migrating their operations to the cloud. There is a wide gap between the IT leaders who have migrated over 50 per cent of their workloads to the cloud and those trailing behind with less than 5 per cent.

One of the main reasons for slower cloud adoption is security. According to a study by LogicMonitor, two-thirds of IT professionals state that security is their key concern in cloud migration. Improving security will therefore be a key industry objective in the coming years. With new solutions for compliance and data control in place, companies that haven't yet moved to the cloud will have fewer reasons not to. Indeed, solutions that answer questions around data control, compliance and user security, rather than compute requirements, will be the driving force behind enterprise cloud adoption.

Security responsibilities still rest mostly with customers, and more of them are using cloud visibility and control tools to lower the chances of security failures and breaches. Machine learning, predictive analytics and artificial intelligence are set to provide new levels of security and accelerate the number of large-scale, highly distributed deployments. No computing environment can guarantee complete and flawless security, yet moving towards 2020, more businesses will likely feel safer working with the cloud than at any point in the past decade.

Edge computing will reimagine the cloud

Cloud computing is usually viewed as centralised data centres running thousands of physical servers. However, this perception is missing one of the biggest opportunities brought by cloud computing – distributed cloud infrastructure. As businesses require near-instant access to data and computing resources to serve customers, they are increasingly looking to edge computing.

Edge computing directs specific processes away from centralised data centres to points in the network close to users, devices and sensors. It was described by IDC as a “mesh network of microdata centres that process or store critical data locally and push all received data to a central data centre or cloud storage repository, in a footprint of less than 100 square feet”.

Edge computing is essential for the Internet of Things (IoT), which requires large amounts of data to be collected and processed in real time with low latency. Edge computing helps IoT systems lower connectivity costs by sending only the most important information to the cloud, as opposed to raw streams of sensor data. For example, a utility with sensors on field equipment can analyse and filter the data locally before sending it on, rather than taxing network and computing resources.
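As a minimal sketch of this filtering pattern, the snippet below keeps routine readings at the edge and forwards only out-of-range ones to the cloud. The thresholds, sensor names and record fields are purely illustrative, not taken from any real device or platform:

```python
# Sketch of edge-side filtering: process raw sensor readings locally
# and forward only the anomalous ones to the cloud.
# Thresholds and record fields are hypothetical examples.

def filter_readings(readings, normal_range=(10.0, 90.0)):
    """Split readings into those to forward to the cloud and those to keep local."""
    low, high = normal_range
    forward = [r for r in readings if not (low <= r["value"] <= high)]
    keep_local = [r for r in readings if low <= r["value"] <= high]
    return forward, keep_local

raw = [
    {"sensor": "pump-1", "value": 42.0},   # normal: stays at the edge
    {"sensor": "pump-2", "value": 97.5},   # anomalous: sent to the cloud
    {"sensor": "pump-3", "value": 3.2},    # anomalous: sent to the cloud
]

to_cloud, local = filter_readings(raw)
print(len(to_cloud), len(local))  # 2 1
```

The point of the pattern is that only the two anomalous records ever cross the network; the bulk of the stream is processed and discarded or summarised at the edge.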

Edge computing is not the final stage of cloud computing, but rather a stage in its evolution that is gaining rapid adoption across industries.

Widespread containerisation

Moving into 2020 we'll continue to see the growing adoption of containers, the technology that enables developers to manage and migrate software code to the cloud. Recent research by Forrester estimates that a third of enterprises are already testing containers for use in software development, while 451 Research forecasts that the container market will reach $2.7 billion in 2020, with annual growth of 40 per cent. According to a Cloud Foundry report, more than 50 per cent of organisations are already testing or using containers in development or production. For businesses using multi-cloud infrastructure, containers enable portability between AWS, Azure and Google Cloud, and support DevOps strategies that speed up software production.

Because containers use operating-system-level virtualisation rather than hardware virtualisation, they are lightweight to deploy, and Kubernetes is becoming the biggest trend in orchestrating them at scale. It is clear that containers will be less a buzzword and more a widespread development standard as the decade turns.

Serverless gains its momentum

"Serverless computing" is in some sense a misleading term, as applications still run on servers. With serverless computing, however, a cloud provider runs code only when required and charges only while the code is executing. Businesses no longer have to worry about provisioning and maintaining servers when putting code into production.
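To make the model concrete, a function deployed this way is typically just a handler that the provider invokes on demand. The sketch below follows the general shape of an AWS Lambda Python handler; the event payload and its fields are hypothetical examples, and in production the provider, not your code, performs the invocation:

```python
# Sketch of a serverless function in the AWS Lambda Python handler style:
# the provider calls handler(event, context) on demand and bills only
# for execution time. The event fields here are hypothetical.

def handler(event, context=None):
    """Build a simple response from the incoming event payload."""
    name = event.get("name", "world")
    return {"statusCode": 200, "body": f"Hello, {name}!"}

# Local invocation purely for illustration; no server code is needed,
# because the cloud provider owns the invocation loop.
result = handler({"name": "cloud"})
print(result["body"])  # Hello, cloud!
```

Everything outside the handler, such as scaling, routing and idle capacity, is the provider's problem, which is exactly the trade-off described above.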

Serverless computing gained mainstream popularity back in 2014, when AWS unveiled Lambda at its re:Invent conference, and got further traction recently when AWS announced its open source project Firecracker. Serverless computing is predicted to be one of the biggest developments in the cloud space. Still, not everyone is ready for it: moving to serverless infrastructure requires an overhaul of the traditional development and production paradigm, and means outsourcing the entire infrastructure.

Serverless computing will not be an overnight sensation. Instead, it will be adopted and developed alongside a growing number of use cases. Current solutions usually lock customers into a specific cloud provider, but the arrival of open source solutions in this space will enable a wider portfolio of serverless implementations across the industry.

Open source is more relevant than ever

Open source software is the most popular it's ever been. An increasing number of organisations are integrating open source software into their IT operations or even building entire businesses around it. Black Duck Software surveyed IT decision makers and found that 60 per cent of respondents use open source software in their organisations. More than half of the businesses surveyed reported contributing to open source projects.

The cloud provides an ecosystem in which open source can thrive. The large number of open source DevOps automation tools and infrastructure platforms, such as OpenStack and Kubernetes, is supporting growing open source adoption.

As organisations continue to migrate their operations to the cloud, open source technologies will boost innovation beyond 2020. Given the current development landscape, it is clear that cloud computing's biggest momentum is still to come.


Stephan Fabel has ten years of hands-on cloud architecture and product management expertise. Starting with running one of the first production OpenStack data centres, Stephan scoped, designed and managed global cloud implementations for major customers including Apple, AT&T, Verizon, SAP and others. He also led the product management process for hybrid cloud monitoring tools based on needs identified at the largest cloud customers. Stephan currently heads the product strategy team at Canonical, the company behind Ubuntu.
