For as long as organisations have needed data storage, whether in cloud, hosted or enterprise computing environments, data centres have been behind them catering to those needs. They house anywhere from tens to thousands of servers among racks of networking equipment, alongside the critical infrastructure systems that support them. AI can now be applied to the benefit of these data centres.
The traditional rules and heuristics of the data centre no longer suffice for running modern infrastructure. As scale, complexity and optimisation demands grow, the need to move with the times and look towards Artificial Intelligence (AI) has become increasingly apparent.
AI can be beneficial in many direct, and sometimes unexpected, ways. A massive amount and variety of data is available for systems to work with, from critical infrastructure to internal IT systems and applications, as well as external environmental factors such as weather patterns. When analysed and synthesised by AI, this data can drive ever-increasing availability and optimisation, helping operators meet SLAs and ultimately minimise operating expenses.
The Need For AI
Many factors are contributing to the need for AI within data centres. Across data centres throughout the world, efficiency has become ever more critical. A recent study from the U.S. Department of Energy found that a data centre uses up to 50 times more energy per square foot than a standard commercial building, and that data centres consume more than 2 per cent of all electricity in America. Faced with such numbers, operators are now using AI to analyse consumption data, cut costs and reduce the industry’s energy footprint.
One of the ways they are doing this is through data centre consolidation. As an industry, one of the main avenues of profit is economies of scale: the more customers, servers and racks an organisation has, the more money it can make. But this, of course, means more physical space and more power. By consolidating facilities and using AI to evaluate better ways of storing data on less hardware, businesses can densify the way they store enormous amounts of data while lowering power usage.
Another method of consolidation, one which takes the onus away from organisations, is colocation. Colocation providers allow organisations to rent space for servers and other computing hardware at whatever scale they require. The beauty of colocation is that providers’ efficiency-driven business models already benefit from, and are thus driving, AI.
There has also been the rise of Edge data centres. These are smaller, geographically dispersed set-ups that allow hardware and data to be placed close to where they are needed. Rather than operating as single entities, these Edge sites combine with central data centres or cloud computing to build a sizeable cooperative computing fabric. Best of all, a topology like this, which provides numerous rich inputs and controls for optimisation and availability, is ideally managed by AI.
Developing AI
A data centre is an ever-changing environment, and as it grows, the AI that manages it must evolve in step. There are several areas where AI is being applied in data centres today:
- Optimising energy usage by managing the numerous types of cooling (room, row and rack) with great precision. It is not uncommon for different cooling systems, each running its own continual feedback and optimisation algorithms, to conflict with one another; AI provides an ideal mechanism for managing this complexity. Some of the best and most intriguing examples use weather data to help predict and address hot spots in the data centre (a minimal sketch follows this list).
- Optimising availability by accurately predicting future application behaviour down to the rack and server, so that workloads can be pre-emptively moved within or across data centres based on predicted power, thermal or equipment behaviour.
- Multi-variate preventative maintenance, delving down to the component level within equipment to predict failures and nip them in the bud.
- Intelligently managing alarms and alerts by filtering and prioritising significant events. A common problem in data centres is dealing with chained alerts, which make it difficult to identify the root cause. AI, coupled with rate-of-change, deviation or similar algorithms, provides an ideal mechanism for isolating the truly critical alerts (see the second sketch below).
- Optimising IT equipment placement by forecasting future states of the data centre rather than simply reacting to the current configuration.
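To make the cooling example concrete, here is a minimal sketch of how a model might anticipate hot spots from weather and load data. It trains on synthetic history; the feature set, the gradient-boosting model and the 27 C alert threshold are illustrative assumptions rather than a description of any particular production system.

```python
# Minimal sketch: predict rack inlet temperature from outside weather and
# IT load so hot spots can be anticipated rather than reacted to.
# All feature names and thresholds are illustrative assumptions.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(42)

# Synthetic history: outside temp (C), humidity (%), rack IT load (kW).
X = np.column_stack([
    rng.uniform(5, 35, 1000),
    rng.uniform(20, 90, 1000),
    rng.uniform(2, 12, 1000),
])
# Toy ground truth: inlet temperature rises with outside temperature and load.
y = 18 + 0.25 * X[:, 0] + 0.6 * X[:, 2] + rng.normal(0, 0.5, 1000)

model = GradientBoostingRegressor().fit(X, y)

# Tomorrow's weather forecast plus the scheduled load for one rack.
forecast = np.array([[33.0, 70.0, 11.5]])
predicted_inlet_c = model.predict(forecast)[0]

HOT_SPOT_THRESHOLD_C = 27.0  # assumed alert threshold
if predicted_inlet_c > HOT_SPOT_THRESHOLD_C:
    print(f"Predicted inlet {predicted_inlet_c:.1f} C: pre-cool or shift load")
```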
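The alarm-management point can be sketched just as simply. The filter below combines a rate-of-change check with a deviation-from-baseline check, so the sudden jump that starts a chain of alarms is flagged once as critical while the steady downstream readings stay quiet. The AlertPrioritiser class, window size and thresholds are hypothetical values chosen for illustration.

```python
# Minimal sketch: prioritise alerts using rate-of-change and deviation
# checks. Window size and thresholds are illustrative assumptions.
from collections import deque
from statistics import mean, stdev

class AlertPrioritiser:
    def __init__(self, window=30, rate_limit=2.0, z_limit=3.0):
        self.history = deque(maxlen=window)
        self.rate_limit = rate_limit  # max tolerated change between readings
        self.z_limit = z_limit        # max tolerated deviation from baseline

    def score(self, reading: float) -> str:
        """Classify a sensor reading as 'critical', 'warning' or 'normal'."""
        severity = "normal"
        if self.history and abs(reading - self.history[-1]) > self.rate_limit:
            severity = "critical"     # sudden jump: likely root cause
        elif len(self.history) >= 2:
            mu, sigma = mean(self.history), stdev(self.history)
            if sigma > 0 and abs(reading - mu) / sigma > self.z_limit:
                severity = "warning"  # slow drift away from its baseline
        self.history.append(reading)
        return severity

# Usage: the jump from 22.1 to 27.5 is flagged; the readings that follow
# are not, which keeps a chained-alert storm from drowning the root cause.
p = AlertPrioritiser()
for t in [22.1, 22.0, 22.2, 22.1, 27.5, 27.6]:
    print(t, p.score(t))
```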
Although AI has countless benefits and has firmly got its foot in the door of the data centre, two points are critical for its continued success and development.
Firstly, AI thrives on large and rich data streams; it is imperative that the right systems be in place to collect and compile this data across all parts of the data centre, from IT systems to applications to the critical infrastructure.
Secondly, expectations need to be set for what AI will deliver, especially around autonomous control. Among the many benefits of AI, one of the most important is real-time analysis of, and action on, data streams; failing to follow through on that analysis limits many of the advantages the technology provides. This is not a matter of renouncing control, but of putting the appropriate management systems in place to realise the full benefit of the technology while still setting limits and boundaries.
Data centres present an ideal use case for AI: complex, energy-intensive and critical, with a very large set of inputs and control points that can only be properly managed through an automated system. This is an ideal breeding ground, and with ever-evolving innovation in the data centre, the need for and benefit of AI will only increase.
Enzo Greco is Chief Strategy Officer for Nlyte Software, the leading data centre solution provider for optimising and automating the management of software and critical infrastructure in data centre and colocation facilities.