Claire Buchanan, Senior Vice President of Global Operations at Bridgeworks, identifies data transfer rates as one of the critical components of successful cloud adoption.
Sitting in a meeting room the other day, with the discussion swirling around Big Data and Cloud, as it probably does in most technology companies, we decided to get real and answer the question: "What is the barrier stopping enterprises from adopting the cloud?"
We then identified the six pillars of successful cloud adoption, the Six Ss: service, speed, scale, security, sovereignty (of data) and simplicity. We agreed that security could be overcome and that service and sovereignty come down largely to the choice of geography and provider; the real barriers are speed and scale – and then keeping it simple.
Clearly, speed and scale mean many things across the technology business. For the purposes of this blog, and as one of the biggest inhibitors, they come down to the sheer size of the big data challenge and to moving data over distance fast, be that from data centre to data centre or from customer to host provider. The enormous challenge of getting that data quickly, efficiently and in a 'performant' manner from source to host is the key. The WAN limitations are easy to identify (a back-of-envelope calculation follows the list):
- the size of the pipe (in this case bigger does not necessarily mean faster),
- the protocol, and
- the amount of data that needs to be moved.
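To put the size-of-pipe point in numbers, here is a minimal back-of-envelope sketch in Python. It assumes the link is the only bottleneck and runs at full line rate, which real long-distance transfers never achieve:

```python
# Back-of-envelope WAN transfer times, assuming the link is the only
# bottleneck (no protocol overhead, latency effects or contention --
# real long-haul transfers are slower, often dramatically so).

def transfer_time_seconds(data_gigabytes: float, link_gbps: float) -> float:
    """Time to move data_gigabytes over a link_gbps link at line rate."""
    return data_gigabytes * 8 / link_gbps   # bytes -> bits, then divide

for size_gb, label in [(1, "1 GB"), (1_000, "1 TB"), (1_000_000, "1 PB")]:
    secs = transfer_time_seconds(size_gb, link_gbps=10)
    print(f"{label:>5} over 10 Gb/s: {secs:,.0f} s ({secs / 3600:.1f} h)")
```

Even at a full, uncontended 10 Gb/s, a petabyte is more than nine days of transfer – and, as we will see, a single flow over distance rarely gets anywhere near line rate.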
We have all heard of those projects that very nearly made it to the cloud provider, only for the time and cost of transferring the data to make them too big to contemplate. Moving data on tapes is still commonplace, as it is viewed as the safer and faster route in a non-cloud world. From a private individual trying to upload a movie to Dropbox to a huge corporation running a US coast-to-coast daily batch process, the problem is extremely real. Even the mighty Amazon lets clients ship in their drives to upload data to the cloud – such is the problem of transferring data across the WAN, all day, every day.
So what do CIOs want?
Let's start with the simple stuff: get the cold (legacy) data off the primary storage tier. Some 80% of that data has no real value in day-to-day business terms, but for regulatory, governance and compliance reasons it has to live somewhere. The challenge, therefore, is to migrate data out of the production environment and into storage. The cloud offers safe, compliant storage, driving down cost and bringing simplification to corporate environments, but the very basic challenge remains – moving it. Most large organisations want to offload their legacy data and have someone else manage or archive it for them at a fraction of the in-house cost.
Before the cloud we had the philosophy of Write Once Read Many (WORM), but the reality of the cloud is that we want to put data we might not necessarily need, but cannot destroy, somewhere less costly. In this cloud world it is no longer WORM; it is very much Write Once Read Possibly (WORP). We are not talking about warm data here, but more often than not cold or even glacial data.
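As a purely illustrative sketch of what that tiering can look like, here is a minimal example using AWS S3 lifecycle rules via boto3; the bucket name, prefix and day thresholds are assumptions for illustration, not a description of any particular deployment:

```python
# Hypothetical example: tiering WORP data down to cheaper storage with
# an S3 lifecycle rule. Bucket name, prefix and day counts are invented
# for illustration. Objects untouched for 90 days drop to an
# infrequent-access tier; after a year they move to Glacier --
# retained for compliance, read possibly, at a fraction of the cost.
import boto3

s3 = boto3.client("s3")
s3.put_bucket_lifecycle_configuration(
    Bucket="example-corp-archive",             # assumed bucket name
    LifecycleConfiguration={
        "Rules": [{
            "ID": "worp-cold-data",
            "Filter": {"Prefix": "legacy/"},   # assumed key prefix
            "Status": "Enabled",
            "Transitions": [
                {"Days": 90, "StorageClass": "STANDARD_IA"},
                {"Days": 365, "StorageClass": "GLACIER"},
            ],
        }]
    },
)
```

The policy side is the easy part; the hard part, as ever, is getting the terabytes into the bucket in the first place.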
Does it all sound simple so far?
For small organisations it is, but for medium to large organisations the elephant in the room is how to move terabytes or even petabytes of data across the WAN. WAN optimisation, I hear you say – that should fix it. Not in a Write Once Read Possibly world (remember WORP?): to deduplicate, the system needs to learn from repeated data, and write-once data gives it nothing to learn from. At the risk of slipping down a slippery slope: these products don't optimise the WAN, they simply cut down the size of the files you send across it, and that has computational limitations – more data optimisation than WAN optimisation, would you not agree? Ask a simple question, "How many megabytes per second can your WAN optimisation product get on a 10Gb link?" – that should be fun.
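To see the limitation concretely, compare how much a redundancy-based reducer can actually shrink different payloads. A minimal Python sketch, using zlib as a stand-in for the appliance's compression engine:

```python
# Why compression/dedupe-style "WAN optimisation" struggles with data
# that is already compressed or encrypted: reduction relies on
# redundancy, and high-entropy payloads have essentially none.
import os
import zlib

repetitive = b"the same log line again and again\n" * 30_000
high_entropy = os.urandom(len(repetitive))   # stand-in for encrypted data

for label, payload in [("repetitive text", repetitive),
                       ("encrypted-like data", high_entropy)]:
    ratio = len(zlib.compress(payload)) / len(payload)
    print(f"{label:>20}: compressed to {ratio:.1%} of original size")
```

Repetitive data shrinks to a sliver; encrypted or pre-compressed data barely moves, so on WORP traffic the appliance adds CPU and latency for near-zero gain.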
What is really needed is not what we know as WAN optimisation: we need something that moves data at the speed of light, because that is the only physical limitation. Something that can move all data at scale (even data encrypted or compressed at source – business wants security, and encryption is becoming ever more popular) in an unrestricted manner. And remember simplification: it should be simple to install, say in less than an hour, and should require no network or storage management time. It should just run on its own, with the only intervention being to pull off the stats.
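Why is distance the enemy? Because a single TCP stream's throughput is capped by its window size divided by the round-trip time (the bandwidth-delay product), however fat the pipe. A small sketch of the arithmetic – the window and RTT figures are illustrative assumptions:

```python
# Single-flow TCP throughput is bounded by window / RTT, regardless of
# link capacity. Figures are illustrative: a classic 64 KB window over
# an assumed 60 ms US coast-to-coast round trip.

def max_throughput_gbps(window_bytes: int, rtt_seconds: float) -> float:
    return window_bytes * 8 / rtt_seconds / 1e9

window = 64 * 1024   # bytes: classic default TCP window (assumption)
rtt = 0.060          # seconds: assumed coast-to-coast round trip

single = max_throughput_gbps(window, rtt)
print(f"One stream tops out at about {single * 1000:.1f} Mb/s")
print(f"~{10 / single:.0f} such streams to saturate a 10 Gb/s link")
```

That is one reason fat pipes sit nearly idle over distance, and why anything that fills them – through larger windows, many parallel streams or both – has to do so without adding management burden.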
The news is good: it can be done. You just need to open your mind to the seemingly impossible. Just to whet your appetite, on a 10Gb link how does 1 GB in 1 second grab you? (A 10Gb/s link carries at most 1.25 GB/s, so that is roughly 80% of line rate.) Better yet, 1 TB in 16.2 minutes? There are corporations live and in production on 20Gbps links, basking in the simplicity of it all, not to mention the cost, productivity and continuity savings. Just for good measure, they can throttle back if they need to reserve a percentage of the pipe for other things, and still get lightning speed. The barrier to enterprise adoption of the cloud can be removed.