Artificial intelligence goes mainstream? Not before a reality check.

It’s widely expected that machine learning and artificial intelligence will follow the trajectory of cloud technology – breaking into the mainstream and impacting our lives significantly.

Cloud technology is now widely understood among decision makers and operates in an increasingly mature market – a far cry from the days when the concept of the cloud was only understood by a select few technologically-minded individuals. The Cloud Industry Forum (CIF) confirmed this in 2017, polling 250 IT and business decision makers in large corporations, small-to-medium-sized businesses and public-sector organisations. The research determined the UK cloud adoption rate had reached 88 percent.

Yet while cloud technology can now be considered mainstream in most sectors, AI and machine learning still represent unknown quantities to many business decision makers – a number of whom work in sectors that could benefit from adopting nascent technology to streamline business practices and uncover unique data insights. For example, Ocado recently incorporated machine learning to automate the running of its warehouses – streamlining the experience for customers and allowing the organisation to predict demand for products.

However, for this technology to follow the trail to mainstream adoption blazed by cloud technology, the tech sector has serious work to do to increase understanding. A good starting point, and indeed a bone of contention within the tech sector itself, would be to address the difference between the two.

Machine learning and AI – what’s the difference?

The interchangeable use of the phrases AI and machine learning is undoubtedly a barrier to understanding – and one that must be overcome before either concept becomes truly mainstream.

To this end, it’s important to understand that machine learning is a learning process and a sub-field of AI. In fact, what is commonly described as AI can be separated into three sub-fields: machine learning, knowledge representation, and algorithmic approaches.

Machine learning is best described as a statistical method of identifying patterns in datasets in order to make predictions. Knowledge representation allows a system to encode individual pieces of knowledge, and their relationship with others, to create a knowledge base from which to inform actions. Finally, an algorithmic approach involves using dynamic programming as a method of knowledge gathering.
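To make knowledge representation more concrete, here is a toy sketch in Python: individual pieces of knowledge stored as (subject, relation, object) triples, with a small query that follows the relationships between them to inform an answer. The facts and the helper function are purely illustrative.

```python
# A toy knowledge base: each piece of knowledge is a (subject, relation,
# object) triple. The facts and the `related` helper are illustrative.
facts = {
    ("penguin", "is_a", "bird"),
    ("bird", "has", "wings"),
    ("penguin", "cannot", "fly"),
}

def related(subject, relation):
    """Return everything linked to `subject` by `relation`."""
    return {o for (s, r, o) in facts if s == subject and r == relation}

# Follow the is_a relationship to inherit knowledge from a broader category.
for category in related("penguin", "is_a"):
    print("a penguin inherits:", related(category, "has"))  # {'wings'}
```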

Machine learning in particular can take place in different contexts, be that supervised learning, unsupervised learning or reinforcement learning. Supervised learning uses labelled data to train a machine to predict outputs based on inputs – for example, the number of ice creams sold as a function of outdoor temperature. Unsupervised learning sees a machine exploit unlabelled data to obtain insights – for example, whether the data can be grouped or associated in any way, highlighting key features within a complex dataset. In simple terms, supervised learning results in a machine producing approximations, or predictions, while unsupervised learning results in a machine providing descriptions of the data it is fed.
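As a minimal sketch of the contrast – assuming Python and the scikit-learn library, with purely illustrative figures – the two contexts look like this:

```python
# Minimal sketches of supervised and unsupervised learning with scikit-learn.
# All numbers are illustrative, not real data.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.cluster import KMeans

# Supervised learning: labelled data pairs inputs with known outputs.
temperatures = np.array([[15], [18], [21], [24], [27], [30]])  # inputs (°C)
ice_creams_sold = np.array([120, 150, 210, 260, 330, 410])     # known outputs

model = LinearRegression().fit(temperatures, ice_creams_sold)
print(model.predict(np.array([[25]])))  # an approximation for an unseen input

# Unsupervised learning: unlabelled data, searching for structure.
observations = np.array([[1.0, 1.1], [0.9, 1.0], [8.0, 8.2], [8.1, 7.9]])
clustering = KMeans(n_clusters=2, n_init=10, random_state=0).fit(observations)
print(clustering.labels_)  # a description: which group each observation is in
```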

Reinforcement learning is a process by which a machine learns in a real-time, interactive environment through trial and error. Reinforcement takes place through rewards and punishments, with the machine trying to maximise the cumulative reward it receives. This is often demonstrated with an agent trying to complete a game of Pac-Man – where eating the food is the reward and being killed by a ghost is the punishment. The agent plays the game and, through trial and error, learns which route to take to maximise the reward and eventually win the game.
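In code, that trial-and-error loop can be sketched with Q-learning, one common reinforcement learning algorithm. The "game" below is not Pac-Man but a hypothetical five-square corridor: the agent starts at square 0, the food (reward) sits at square 4, and every step costs a small penalty. All parameter values are illustrative.

```python
# A toy reinforcement learning sketch: Q-learning on a five-square corridor.
# The environment and all parameter values are illustrative.
import random

n_states, actions = 5, [-1, +1]           # squares 0-4; move left or right
q = {(s, a): 0.0 for s in range(n_states) for a in actions}
alpha, gamma, epsilon = 0.5, 0.9, 0.1     # learning rate, discount, exploration

for episode in range(500):
    state = 0
    while state != 4:
        # Trial and error: usually exploit what has been learned, sometimes explore.
        if random.random() < epsilon:
            action = random.choice(actions)
        else:
            action = max(actions, key=lambda a: q[(state, a)])
        nxt = min(max(state + action, 0), n_states - 1)
        reward = 10 if nxt == 4 else -1   # the food rewards; each step punishes
        # Nudge the estimate of cumulative reward for this state/action pair.
        q[(state, action)] += alpha * (
            reward + gamma * max(q[(nxt, a)] for a in actions) - q[(state, action)]
        )
        state = nxt

# The learned policy: the best action at each square (all +1, i.e. move right).
print([max(actions, key=lambda a: q[(s, a)]) for s in range(n_states - 1)])
```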

It is this incarnation of machine learning that is closest to what most people would identify as AI, though it is more accurately described as machine learning. In this context, many decision makers pondering how ‘AI’ could transform their organisation would more accurately be said to be considering how machine learning could benefit their business.

Providing clarity on this point – and fostering a greater understanding of disruptive technology – should be a key goal for the tech sector. To achieve it fully, however, there is another important challenge to face. Myth busting.

Hollywood misconceptions

One of the difficulties around introducing machine learning into the mainstream is that preconceptions about the technology already exist. There is no clean slate from which an understanding of this new technology can be built. Instead, many people already have preconceived ideas fuelled by popular culture.

Within this context, people are often told machine learning will be transformational. That it will disrupt our working practices and have a major impact on our lives. That may be true. The problem is that many people hearing these things haven’t encountered machine learning in a business context. Rather, their only exposure to it has been through popular media, where it is entangled within the broader concept of AI.

The result is that there is little real public awareness of the capabilities of this technology. Many people believe they have little real-world experience of it, and no reliable reference point to help them understand what the technology can and cannot do. Even those who benefit from it – for example, by using autocomplete functions when composing text messages or engaging with chatbots – are often unaware of the machine learning powering these innovations. Addressing these misconceptions by raising awareness of real-world applications of machine learning, such as sales forecasting, big data analysis and the automation of factory processes, is therefore essential.

Making the most of machine learning – and demonstrating results

From a business perspective, it is the responsibility of our sector to build a greater understanding of machine learning, so that its potential can be harnessed as part of business transformation strategies.

It’s crucial that projects begin with a strong set of objectives and that clients are engaged throughout the process. Simplifying the conversation surrounding this innovative technology also allows us to demonstrate tangible results that reflect the objectives of our clients. Working to very specific briefs provides a framework against which tangible, measurable results can be judged.

An example of this is the work CTS carried out with Leiden University Medical Centre and Google. The brief demanded that the administrative load on medical staff be lowered by automating time-consuming processes. This goal was achieved by using speech-to-text technology to record the conversations between doctors and patients and saving the data to the system automatically. The system removed the requirement for doctors to manually input data, minimised the burden of administrative workloads, and opened up the possibility of analysing the information captured.
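The details of that system are not public, but for a rough idea of the core building block, here is a minimal sketch of transcribing a recorded consultation with Google’s Cloud Speech-to-Text API. The bucket URI, sample rate and language code are hypothetical placeholders – this is not the CTS/Leiden implementation.

```python
# A minimal sketch of speech-to-text transcription with Google's Cloud
# Speech-to-Text API. The bucket URI, sample rate and language code are
# hypothetical placeholders; this is not the CTS/Leiden implementation.
from google.cloud import speech

client = speech.SpeechClient()

audio = speech.RecognitionAudio(uri="gs://example-bucket/consultation.wav")
config = speech.RecognitionConfig(
    encoding=speech.RecognitionConfig.AudioEncoding.LINEAR16,
    sample_rate_hertz=16000,
    language_code="nl-NL",  # assumption: Dutch-language consultations
)

# Short clips can use the synchronous API; longer recordings would use
# client.long_running_recognize instead.
response = client.recognize(config=config, audio=audio)

# Take the top-ranked transcript for each chunk of recognised speech – the
# text that could be written to the records system instead of typed by hand.
for result in response.results:
    print(result.alternatives[0].transcript)
```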

Delivering against these briefs not only demonstrates the value machine learning can deliver but also manages expectations and ensures that new adopters aren’t discouraged by bad experiences or unrealistic expectations.

With so many products that can work in tandem or in competition, it’s clear that jargon must be replaced with straight talking. It’s essential to dispel misconceptions and drive wider take-up of emerging technology.

Chief Technology Officer at Cloud Technology Solutions
