We should not be surprised to see history repeat itself, and the adoption of cloud technology is no exception. The cloud is not new, but our wholesale, blind adoption of it follows a familiar pattern.

It may be human nature that compels us to leap into the pool without first looking. In the days of the PC, we loaded up thousands of desks with IBM and IBM-compatible PCs. When Lotus 1-2-3 arrived, thousands of companies migrated sensitive financial data onto floppy disks. Not long after, we embarked on the era of the LAN, which made it easy for us to share expensive printers. This then moved to Wide Area Networks, enabling enterprise-wide sharing and collaboration. Email, which had been around since the late 1970s on UNIX and other multi-user systems, became mainstream in the late 1980s and exploded with the advent of the commercial internet in the early 1990s.

In all of these cases, there was one aspect of technology that lagged behind all of the new innovations: management. Whether it was system management or network management, in every case the tools needed to manage the tools came after. It was not until things started to go wrong that customers realised there was a need to secure, protect and manage the new technology that had already become mission-critical and irreplaceable.

We are doing it again.

In the past two decades, we have embraced the wonders of virtualization and open source, both of which have caused a major technology revolution. Virtualization has allowed us to exploit unused processor and network resources and to introduce automation and agility into the servers that power the data centre. These technologies are also the seeds of our current cloud market. These virtualized machines have made Amazon, Google and Facebook possible. Can you guess how much Google or Amazon pay Microsoft to operate their infrastructure? If you guessed nothing, you’d be right.

The ability to vastly reduce the cost of delivering compute and network capabilities is the driving force behind cloud solutions. This is why it should be cheaper for companies to run applications in the Amazon cloud than in the corporate data centre, but is that true? Is it in all cases cheaper to run in a public cloud?

We as an industry have a habit of leaping before we look. Digital transformation is fuelling this, and in an effort to keep up with the competition, conversations like the one below are taking place.

CEO: “Mr. CIO I want us to move to the cloud.”

CIO: “Yes, and why is that sir?”

CEO: “Because it is cheaper and faster and cooler than what we are doing.”

CIO: “Sir, do you know how much we currently spend or what our current performance levels are?”

CEO: “No. I have no idea, but I am sure they will be better when we get into the Cloud!”

A new paradigm

This is perhaps slightly dramatised, but it has been taking place in executive suites and boardrooms around the globe: “If the big guys, Google, Facebook and Amazon, are doing it, then it must be good.”

There are, of course, enormous benefits to utilising virtualization and open source technology, regardless of industry. Incorporating agility and continuous development and deployment into your business strategy is also becoming crucial. Alongside this, there is a deep need to digitally transform your company in order to keep pace with the market, innovate and win new customers. There is also an easy way to do all of the above wrong.


The easiest way is to think that this new technology is merely a new way to do what you have been doing, remaining trapped in what Thomas Kuhn would have called the old paradigm: what you know and how you have done things in the past should not dictate future endeavours. Too many companies are attempting what they have always done using a new paradigm, as opposed to thinking about what they need to do to win and then applying new approaches to solve those problems.

Digital Transformation asks the question, “How can we better serve our existing customers whilst attracting new ones?” The answer falls under the category of faster innovation. This is where the cloud comes in. The value of the cloud is not just cost and performance. It is also about speed and agility, scaling up and down based on market and customer needs. It is about deploying new ideas quickly, vetting their performance, improving or dismissing them, and moving from a fail-safe to a safe-to-fail environment. Whether this is done in a public or private setting depends entirely on the business in question.

Back to basics

How do we manage this journey? How do we deliver more predictable outcomes? If we go back to our conversation, what we need in order to make an intelligent decision is exactly that: intelligence.

To determine if our new solution is faster, we need to define speed. To know if our new solution is cheaper, we need to define cost. To know if our new solution is better, we need to define the quality of service. In other words, we need to establish an accurate baseline understanding of our existing environment before we attempt to move it. Think of it this way: will the system you’re running today work the same when you move it into the cloud? Will it cost the same? Will it scale the same?

In order to answer these questions, you are going to need a constant you can apply to both your existing environment and your new one. You are going to want to compare apples to apples so you can tell your team that yes, you have improved performance; yes, you have reduced costs; and yes, you have increased automation, agility and the quality of service experienced by your customers. You will need to think about measurement and metrics, choose an approach that will work in both your existing and future environments, and be able to maintain the constants you expect to rely upon.
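To make the idea concrete, a minimal sketch in Python might look like the following. The metric names and figures are purely hypothetical placeholders, not measurements or recommendations from any particular monitoring tool; the point is simply that the same definitions are applied to both environments.

# Illustrative only: hypothetical baseline vs. cloud metrics.
# Metric names and numbers are placeholders, not real data.

baseline = {  # measured in the existing data centre
    "avg_response_ms": 180.0,
    "monthly_cost_usd": 42_000.0,
    "deploys_per_month": 4,
}

cloud = {  # measured after migration, using the same definitions
    "avg_response_ms": 150.0,
    "monthly_cost_usd": 39_500.0,
    "deploys_per_month": 12,
}

def percent_change(before: float, after: float) -> float:
    """Positive means the figure went up relative to the baseline."""
    return (after - before) / before * 100.0

for metric in baseline:
    change = percent_change(baseline[metric], cloud[metric])
    print(f"{metric}: {baseline[metric]} -> {cloud[metric]} ({change:+.1f}%)")

Because the same keys and the same formula are used on both sides, any difference the comparison reports is attributable to the environment rather than to a change in how the metric was defined, which is exactly the constant the paragraph above calls for.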

In the end, we can all learn from our many mistakes. Knowing what we want to accomplish, how we are working today and what would count as an improvement seems to be the bare minimum of an intelligent plan. Before you embark on the cloud journey, take careful stock of where you are and have a clear understanding of your desired outcome: Know Before You Go.


Jim McNiel joined NETSCOUT in July 2014 to lead worldwide Corporate Marketing activities. He has over 30 years of experience in the technology industry as an entrepreneur, leader, and technologist. Jim was the President and CEO of publicly traded FalconStor Software, the inventor of Virtual Tape Library, and a pioneer in the virtualized storage market. Prior to FalconStor, he was the President and CEO of Fifth Generation Systems. Before this, Jim was General Partner at Pequot Capital for nine years where he invested in and served on the boards of Netegrity, OutlookSoft, NetGear, Asia Online, Bowstreet, and many others. Prior to Pequot, he was Executive Vice President of Corporate Development at Cheyenne Software with responsibilities for business development, OEM sales, Product Management, and Investor Relations. During his tenure at Cheyenne, Jim was also instrumental in the creation of ARCServe, the world’s first client-server backup solution. Prior to Cheyenne, he held engineering and senior management roles at Lucasfilm and AST Research. Jim is a graduate of the Advanced Management Program at the Wharton School of Business.

 

Company link:

http://www.netscout.com/
