Cloud Repatriation – Why enterprises are bringing some apps back from the public cloud

Organisations spread applications and data across on-premise, private cloud, and public cloud infrastructure, moving workloads between environments as needed – and a growing number of enterprises are now bringing some of those applications back home.

Just a few years ago, the consensus was that the public cloud was the future and would sooner or later replace physical data centres, and the migration of applications and data into public clouds has been strong ever since. Yet alongside that ongoing migration, cloud repatriation – the decision to move applications and data back on-premise – has become a trend of its own. As the hybrid cloud becomes the standard for most organisations, there has been a dramatic shift in thinking: away from the paradigm that the public cloud is the best place for everything, towards a strategy of placing applications where they fit best – even if that means pulling some back from the public cloud. So what is driving cloud repatriation? In fact, there is quite a long list of factors.

1. The on-premise data centre has evolved into cloud-ready infrastructure

The prerequisite for repatriation is that on-premise data centres have become increasingly software-defined. Public cloud vendors triggered this development by building software-defined, automated IT services with attractive tooling and interfaces for developers. Those software-defined advances are no longer unique to the public cloud and can now be found across the computing spectrum: in private clouds, at the edge, at the distributed core, and even as SaaS or managed services, where they offer cloud-like speed, automation, and self-service.

This has blurred the line between the data centre and the private cloud even further. Vendors such as VMware, AWS, and Microsoft even offer a gateway to the public cloud and back with solutions like VMware Cloud on AWS, AWS Outposts, and Azure Stack. Enterprises are increasingly running cloud-like infrastructure in their own data centres, which gives them a genuine choice over where to place their applications and data.

2. Data gravity

Data gravity is another factor, chiefly affecting where data is stored and the cost and ease of moving it. Applications and data attract each other: the more data there is, the stronger the pull on applications and services to sit close to that data. Many factors influence data gravity, but two matter most: network bandwidth and network latency. It can also be thought of in terms of network effects more broadly – a large, rich collection of information tends to attract more and more services that make use of it. IoT systems, and anything else working with large quantities of data, need to be designed with that reality in mind, because data will continue to accumulate at the edge of the network where it cannot easily be moved.

Storage-drive density is growing exponentially, but moving data to the cloud is getting harder because network capacity is not growing at the same rate. It is hard to generalise how much data is too much to move; it partly depends on the costs of moving and storing the data, such as network charges. But if sending 2 petabytes (PB) across network links to the public cloud is unaffordable today, then sending 5 PB in 12 months' time will be even more so, and 10–15 PB a year after that will be nearly impossible. Even over fibre-optic networks, it would take years to migrate the largest data sets. That is why companies are turning to edge computing to process data where it is created, and starting to pull some of their data back in-house while it is still feasible.
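
To see why those numbers bite, a quick back-of-the-envelope calculation helps. The sketch below (in Python, with illustrative link speeds and a utilisation factor that are assumptions for this example, not figures from the article) estimates how long a bulk transfer takes at different sustained bandwidths:

```python
# Rough estimate of bulk-transfer time: time = data volume / sustained throughput.
# Link speeds and the utilisation factor are illustrative assumptions.

def transfer_days(petabytes: float, link_gbps: float, utilisation: float = 0.7) -> float:
    """Days needed to move `petabytes` over a `link_gbps` link,
    assuming only `utilisation` of the nominal bandwidth is usable."""
    bits = petabytes * 1e15 * 8                        # PB -> bits (decimal units)
    seconds = bits / (link_gbps * 1e9 * utilisation)   # bits / effective bits-per-second
    return seconds / 86_400                            # seconds -> days

for pb in (2, 5, 15):
    for gbps in (1, 10):
        print(f"{pb:>3} PB over a {gbps:>2} Gbps link: ~{transfer_days(pb, gbps):,.0f} days")
```

Even with generous assumptions about sustained throughput, the multi-petabyte cases run to months or years, which is why large data sets increasingly stay where they were created.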

3. Control, security, and compliance

Another major reason businesses move certain kinds of applications and data away from the public cloud is security. The early migration to the public cloud was accompanied by a misconception that data there was 100% protected and secure. In reality, organisations are at risk if they do not architect the right security and data protection solutions. Organisations now understand more about what the public cloud offers and what it lacks. Bringing data and workloads back in-house can provide better visibility of exactly what is happening and more control over security and compliance. GDPR, for instance, has given organisations a reason to keep their data close as a measure of data sovereignty.

4. Cost

One of the early reasons to move data to the public cloud was cost-effectiveness, especially for huge volumes of backup and archive data. But as more cloud-ready technologies become available in the data centre, the gulf between the two has narrowed, which in turn reduces the cost benefits of the public cloud. For some workloads, on-premise solutions are already more cost-effective than the public cloud.
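
A simple way to reason about that narrowing gap is to compare the recurring cost of cloud storage plus egress against an amortised on-premise cost per gigabyte. The sketch below uses purely illustrative prices – the per-GB rates are assumptions made for the sake of the example, not quotes from any provider:

```python
# Back-of-the-envelope monthly cost comparison: public cloud storage plus egress
# versus amortised on-premise storage. All prices are illustrative assumptions.

CLOUD_STORAGE_PER_GB = 0.02   # assumed $/GB-month for cloud object storage
CLOUD_EGRESS_PER_GB = 0.08    # assumed $/GB for data read back out of the cloud
ONPREM_PER_GB = 0.01          # assumed amortised $/GB-month (hardware, power, staff)

def monthly_cost_cloud(stored_tb: float, egress_tb: float) -> float:
    """Cloud cost: storage plus egress for the data pulled back each month."""
    return stored_tb * 1024 * CLOUD_STORAGE_PER_GB + egress_tb * 1024 * CLOUD_EGRESS_PER_GB

def monthly_cost_onprem(stored_tb: float) -> float:
    """On-premise cost: amortised infrastructure, with no per-GB egress charge."""
    return stored_tb * 1024 * ONPREM_PER_GB

stored_tb, egress_tb = 500, 50   # e.g. 500 TB stored, 50 TB read back per month
print(f"cloud:   ${monthly_cost_cloud(stored_tb, egress_tb):,.0f}/month")
print(f"on-prem: ${monthly_cost_onprem(stored_tb):,.0f}/month")
```

The point is not the specific numbers but the shape of the comparison: egress charges scale with how often data is pulled back, which is exactly the access pattern that makes some workloads cheaper to run at home.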

The hybrid cloud gives organisations the choice to place their applications and data where they fit best, including in their own data centres. That choice, combined with concerns over recent outages, high costs, latency, and questions of control, security, and compliance, is driving the repatriation of workloads and data from public clouds to private clouds. Another big driver is data gravity, which will increasingly make moving large data sets prohibitive as the cost of network transmission grows.

Above all, enterprises are looking for flexibility: solutions and services that can grow with the business without committing exclusively to either side, on-premise or in the cloud. As businesses evaluate the best IT infrastructure for their workloads, hybrid IT – a mix of public cloud, private cloud, and on-premise solutions – will become the norm.

These factors are driving the emergence of application delivery and services technologies that work consistently across different underlying compute infrastructure – bare-metal servers, virtual machines, and containers – and across private and public clouds. One such architectural pattern, the service mesh, has become popular for applications built on microservices and cloud-native architecture, but the concept applies equally well to traditional applications in data centres. With cloud-ready applications spread across hybrid cloud environments, IT teams need to adopt technologies like the service mesh that connect applications to the services they need, wherever they run.
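
As a rough illustration of the idea – a conceptual sketch only, not the API of any particular service-mesh product; the `ServiceRegistry` class, `call_service` helper, and endpoint URLs are invented for this example – the value of the pattern is that application code addresses services by logical name, while routing to an on-premise or cloud endpoint is decided by the mesh layer:

```python
# Conceptual sketch of mesh-style routing: callers use logical service names;
# the mesh layer decides which concrete endpoint (on-premise or cloud) serves
# the request. Names and endpoints here are invented for illustration.
import random

class ServiceRegistry:
    """Maps logical service names to the endpoints currently backing them."""
    def __init__(self) -> None:
        self._endpoints: dict[str, list[str]] = {}

    def register(self, name: str, endpoint: str) -> None:
        self._endpoints.setdefault(name, []).append(endpoint)

    def resolve(self, name: str) -> str:
        # Trivial load-balancing policy; a real mesh would also apply health
        # checks, retries, mutual TLS, and traffic-shifting rules here.
        return random.choice(self._endpoints[name])

registry = ServiceRegistry()
registry.register("orders", "https://orders.dc1.internal")   # on-premise instance
registry.register("orders", "https://orders.cloud.example")  # public cloud instance

def call_service(name: str, path: str) -> str:
    endpoint = registry.resolve(name)
    return f"GET {endpoint}{path}"   # stand-in for an actual HTTP call

print(call_service("orders", "/v1/orders/42"))
```

The application simply asks for "orders"; whether that request lands in the data centre or the public cloud is a policy decision made in the mesh layer, which is what allows workloads to move between environments without code changes.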


James Sherlow is Senior Systems Engineer at Avi Networks and heads system engineering efforts for Avi in the UK, Ireland, and South Africa markets. James is an application security and networking expert and has a career that spans several cybersecurity companies including Palo Alto Networks and Netskope Inc. In his role at Avi Networks, James helps multi-national enterprises to modernise their networks and security posture to deliver applications at scale with a high degree of automation.
