IoT growth pushing the data centre to the edge

The IoT has been a dominant trend in recent years, with Gartner estimating more than eight billion connected ‘things’ in use today. But one trend begets another, and in order to manage and process the real-time data that devices and sensors are providing, a new development is emerging in the form of edge computing and the data centres that support it.

It’s not just the IoT: 5G technology is also just around the corner, and the demands that both will place on existing infrastructure will require a massive shift in how compute resources are delivered.

In a sense, we have been here before. Computers were first deployed in businesses using mainframe architecture to provide centralised processing with terminal access. Networked PCs were then introduced to give end users more compute power, with local processing and storage, connected to larger servers for additional storage.

Now, thirty years later, after the significant move towards re-centralising processing into the single large entity that is the cloud, demand is once again increasing for distributed, localised processing of data – driven by growth in traffic, the complexity of tasks and consumers’ need for low-latency connectivity.

This trend will likely develop further as our homes and personal transport become smarter and create ever more data. That will generate demand for localised processing in which the endpoints are the home and the car, aggregated data is fed back into regional edge sites for the next layer of processing, and the resulting intelligence is sent on to the core data centres, a flow sketched in the example below.
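
As a purely illustrative sketch of that layered flow (the names and structures below are hypothetical, not any specific vendor’s API), the Python snippet aggregates raw endpoint readings at a regional edge site and forwards only the summarised intelligence towards the core:

```python
from dataclasses import dataclass
from statistics import mean
from typing import Dict, List

@dataclass
class EndpointReading:
    """A single reading from a hypothetical endpoint such as a home or car."""
    device_id: str
    metric: str       # e.g. "temperature_c"
    value: float

def aggregate_at_edge(readings: List[EndpointReading]) -> Dict[str, dict]:
    """Runs at a regional edge site: reduce raw endpoint data to compact summaries."""
    by_metric: Dict[str, List[float]] = {}
    for r in readings:
        by_metric.setdefault(r.metric, []).append(r.value)
    return {
        m: {"count": len(v), "mean": round(mean(v), 2), "max": max(v)}
        for m, v in by_metric.items()
    }

def forward_to_core(summary: Dict[str, dict]) -> None:
    """Stand-in for the uplink that carries only the summarised intelligence to the core."""
    print("Forwarding summary to core data centre:", summary)

if __name__ == "__main__":
    raw = [
        EndpointReading("home-42", "temperature_c", 21.5),
        EndpointReading("car-07", "temperature_c", 24.0),
        EndpointReading("home-42", "humidity_pct", 48.0),
    ]
    forward_to_core(aggregate_at_edge(raw))
```

The point of the sketch is simply that raw data stays local while only compact summaries travel upstream, which is what keeps latency and backhaul traffic down.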

If this is the future of the data centre, these edge sites will need to deliver continuous real-time data so that no issue goes undetected. If there is a fault or downtime at a site, engineers need to act quickly; a lack of real-time data could mean faults remain unresolved for days or even weeks.

Accuracy of asset location and status data will become even more important when engineers begin to manage multiple dispersed edge sites. Auditing and general maintenance tasks will need to be undertaken in the most efficient way and this will require up-to-date knowledge of each site’s status.

To put the expected growth of edge data centres into context, around 10% of enterprise-generated data is currently created and processed outside a traditional data centre or cloud; Gartner predicts that by 2022 this figure will reach 50%.

If companies are to take advantage of both the IoT and 5G, they need to localise, and this means designing and building their data centres in closer proximity to users. Not only will this help them to be more efficient, since data is streamed locally and IoT projects can be deployed more quickly, but it will also save on the costs associated with operating cloud or metro-based facilities.

The Internet of Things will challenge the current cloud-centric view of IT by demanding lower latency and higher reliability for real-time applications. This will shift compute, storage and real-time analytics closer to the edge, which will in turn drive demand for data centre technologies. Respondents to 451 Research’s Voice of the Enterprise: IoT, Organizational Dynamics 2017 survey indicate that the leading location for initial IoT analytics will be in data centres (53.9%). This will create opportunities for data centre technology suppliers able to address this demand with micro-modular and regional or local data centres.

Who will lead the march into edge data centres?

Established colocation companies could capitalise on their market position, buying power and infrastructure knowledge to deploy smaller regional sites. This would mean shifting focus from large corporates that require low-latency, metro-based locations and instead catering to a much larger group of smaller users – all connecting via their data providers.

Alternatively, the data providers themselves – the telcos – already have the regionally deployed real estate, telecoms infrastructure and power to enable this; they just need to build ‘the edge’ to support the IoT. Huawei predicts that 20% of telco revenue will come from the IoT by 2025.

Apart from these contenders, there is a large group of potential players such as cable operators, who already have established regional infrastructure, and energy providers, many of whom already have data centre offerings and could expand into edge compute.

Energy providers may also consider this a strategic move, given that they will need to add local capability to support IoT-enabled smart metering and smart control of domestic appliances. Finally, there may also be space for smaller colocation providers to expand and land-grab in this evolving market.

As thousands of edge data centres are constructed and become operational, many will be unmanned or employ minimal staff, which means careful consideration needs to be given to monitoring each facility’s assets and environmental conditions. Given that continuous uptime will be expected, tracking not just the performance of assets but also their locations will be essential.

Real-time data needs to be delivered that allows companies to assess an asset throughout its lifecycle, together with environmental monitoring at a granular enough level that action can be taken immediately when a problem arises. Instant data on temperature, pressure and humidity will reduce potential environmental risks and allow smart decisions to be made and implemented quickly.
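
As a minimal sketch of that kind of granular monitoring (the sensor names and limits below are illustrative assumptions, not RF Code product behaviour), threshold-based alerting on environmental telemetry could look like this:

```python
from dataclasses import dataclass
from typing import Dict, Optional, Tuple

# Hypothetical acceptable ranges for an unmanned edge site.
LIMITS: Dict[str, Tuple[float, float]] = {
    "temperature_c": (18.0, 27.0),
    "humidity_pct": (20.0, 80.0),
}

@dataclass
class SensorSample:
    site: str
    sensor: str     # e.g. "temperature_c"
    value: float

def check_sample(sample: SensorSample) -> Optional[str]:
    """Return an alert message if the sample falls outside its configured range."""
    low, high = LIMITS.get(sample.sensor, (float("-inf"), float("inf")))
    if not low <= sample.value <= high:
        return (f"ALERT {sample.site}: {sample.sensor}={sample.value} "
                f"outside {low}-{high}")
    return None

if __name__ == "__main__":
    samples = [
        SensorSample("edge-site-12", "temperature_c", 31.2),
        SensorSample("edge-site-12", "humidity_pct", 55.0),
    ]
    for s in samples:
        alert = check_sample(s)
        if alert:
            # In practice this would be raised to a remote operator's console.
            print(alert)
```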

Real-time monitoring will be a vital element in the successful deployment of these new edge data centres, delivering a single-pane-of-glass, 360° view so that a remote operator can control and manage these mission-critical locations and give companies the reassurance that uptime is guaranteed right to the edge.

Peter Vancorenland is the Chief Technology Officer at RF Code, delivering continued innovation, scalability and reliability to the RF Code product suite. With 15 years of experience, Dr. Vancorenland helps RF Code build on its reputation for providing products that reduce data center operating costs and increase operational efficiency. Peter has an M.S. and Ph.D. in Electrical Engineering from KU Leuven, Belgium, and holds two dozen patents.
