The bigger big data becomes, the more valuable it gets. Most CIOs understand that the benefits of analytics increase when you collect, store and analyse data from more sources, in greater quantities and without latency. Give a global fashion retailer access to real-time data from its entire store footprint and it can not only react more quickly to new trends but also keep a tighter rein on costs across its supply chain. This is big data working as a strategic asset. In recent Capgemini research, 61 per cent of organisations acknowledged that big data is now as valuable as their actual products and services.
[easy-tweet tweet="61 per cent of organisations acknowledge that big data is now as valuable as their products and services" hashtags="BigData"]
But the bigger big data becomes, the harder it is to manage. For CIOs, it's a problem as intractable as it is inevitable. Sooner or later, you run into the laws of physics. If our fashion retailer wants to store all its transactional data in the cloud, enabling the firm to run complex analytics on customer behaviour patterns, the CIO knows all too well that the minute the data exceeds one or two petabytes (incidentally, the point at which it becomes really useful), it also becomes immovable.
This is, of course, great news for the cloud providers. With widely accepted open standards emerging only slowly, the illusion of choice is, for now, cloud's dirty secret. Once a customer is signed up, switching between providers becomes physically impractical. The most cost-effective method of moving a petabyte of data from A to B has long been to rent a room from your cloud provider, transfer the information onto a lorry-load of hard discs and FedEx them to the data's new home. The irony for organisations navigating the transition to digital business is that big data, the Holy Grail of transformation, is simply too big to handle digitally.
So how do global businesses run global analytics on immovable data? The solution is to behave like a mining company. The size of the mine dictates that you take the digger to the mine, not vice versa. If the data is too big to move, you take your analytics to the data.
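To make the mining analogy concrete, here is a minimal sketch in Python, with invented regional data and names, since the article describes no specific technology: each region runs the same aggregation over its own records, and only the small per-region summaries are combined centrally.

```python
from collections import Counter
from typing import Iterable, Tuple

# Each region's raw transactions stay put; only tiny summaries travel.
# The data here is simulated in memory purely for illustration - in
# practice each list would be a petabyte-scale regional store.
REGIONAL_SALES = {
    "uk":    [("jeans", "blue"), ("jeans", "black")],
    "china": [("jeans", "blue"), ("jeans", "blue"), ("tee", "white")],
}

def local_aggregate(records: Iterable[Tuple[str, str]]) -> Counter:
    """Runs inside the region: reduces raw rows to a small summary."""
    return Counter(records)

def global_view() -> Counter:
    """Combines per-region summaries; raw data never leaves its region."""
    totals: Counter = Counter()
    for region, records in REGIONAL_SALES.items():
        totals += local_aggregate(records)  # kilobytes, not petabytes
    return totals

print(global_view())
```

The design choice matters more than the code: the query travels to the mine, and only the nuggets come back.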
This is fine in principle, but the nature of global business operations is that data tends to be generated according to the local environment. Our fashion retailer, for example, may have its marketing operations and headquarters in the UK, manufacturing operations in India, and a growing sales market in China. With hundreds of millions of Chinese consumers, the business needs access to as much transactional data as it can lay its hands on. The Chinese government may decide that any sales data created in China must remain within its borders; but even without such a rule, the likely size of the Chinese data set would force the retailer to position its analytics locally.
The problem is that the Chinese arm of the business uses different metadata (different ways of describing the most important data), making it very difficult for the analytics to function. The Chinese office may, for example, use different descriptive terms for fundamental items such as customer, size, colour and place of purchase. And as the business grows, the metadata grows with it, and it becomes harder and harder to set data norms that all employees can easily follow. The company may be a global business, but its analytics are operating at a local level.
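A minimal sketch of the translation layer this implies, assuming a canonical global schema exists; every field name and mapping below is an illustrative invention, not the retailer's actual metadata:

```python
# All field names and mappings below are illustrative assumptions,
# not the retailer's real schema.
CANONICAL_FIELDS = {"customer_id", "size", "colour", "store_id"}

# Each region keeps a small mapping from its local metadata terms
# to the shared canonical ones.
FIELD_MAPPINGS = {
    "uk":    {"cust_ref": "customer_id", "size": "size",
              "colour": "colour", "branch": "store_id"},
    "china": {"客户编号": "customer_id", "尺码": "size",
              "颜色": "colour", "门店": "store_id"},
}

def to_canonical(record: dict, region: str) -> dict:
    """Translates one locally described record into the global schema."""
    mapping = FIELD_MAPPINGS[region]
    translated = {mapping[k]: v for k, v in record.items() if k in mapping}
    missing = CANONICAL_FIELDS - translated.keys()
    if missing:
        raise ValueError(f"{region} record is missing: {sorted(missing)}")
    return translated

print(to_canonical(
    {"客户编号": "C42", "尺码": "10", "颜色": "indigo", "门店": "SH-07"},
    "china",
))
```

With a layer like this in place, analytics written against the canonical schema can read records created anywhere in the business.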
Big data is only effective in a global organisation if it is treated as a global asset. For our fashion brand to successfully sell jeans to Chinese consumers at scale, it needs to know exactly how many Chinese consumers are buying size 10 following a sales promotion, which colours are most popular, where those sales are occurring and what else customers add to their baskets. This information must travel quickly across the company's global supply chain if the firm is to maximise the sales opportunity. But if the company's brand-new stock management analytics (created in the UK) can't read Chinese metadata, the business opportunity will be lost.
It's becoming increasingly important for multinationals to understand not only big data's value as a strategic asset, but also the governance required to manage it as one. For data analytics to be successful across the globe, a light touch is required. It's up to the CIO, together with her business peers, to work out the minimum governance required to turn the data into an asset that drives global collaboration. That demands a simple, consistent blueprint: a set of rules that enable collaboration rather than inhibit it. It also requires governance that mirrors the specific dynamics of the organisation, balancing the needs and realities of global and local.
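What might such a minimal blueprint look like? The sketch below is one hedged illustration, not a recommended standard: a handful of global rules expressed as data, with anything beyond them deliberately left to local discretion.

```python
# The rule set below is an illustrative assumption, not a recommended
# standard: a handful of global rules, with everything else left local.
GLOBAL_BLUEPRINT = {
    "required_fields": {"customer_id", "size", "colour", "store_id"},
    "residency": {"china": "cn-region"},  # where certain data must stay
    "retention_days": 730,
}

def violations(record: dict, region: str, stored_in: str) -> list:
    """Returns the global-rule breaches; an empty list means compliant.
    Extra local fields are deliberately ignored - that is the light touch."""
    problems = []
    missing = GLOBAL_BLUEPRINT["required_fields"] - record.keys()
    if missing:
        problems.append(f"missing global fields: {sorted(missing)}")
    home = GLOBAL_BLUEPRINT["residency"].get(region)
    if home and stored_in != home:
        problems.append(f"{region} data must remain in {home}")
    return problems

print(violations({"customer_id": "C42", "size": "10"}, "china", "eu-west"))
```

The point is the shape, not the rules themselves: a blueprint small enough for every region to follow, and loose enough not to inhibit local work.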
[easy-tweet tweet="Big data is only effective in a global organisation if it is treated as a global asset" hashtags="BigData, cloud"]
Managing big data as a business asset is getting progressively more difficult. Fragmented privacy laws remain hard for CIOs to navigate. Data's sheer size means real-time insights can be difficult to generate, while the data itself is often immovable. Yet the biggest barrier holding CIOs back is the governance of an increasingly vital business asset. Until organisations can create a global blueprint that enables simple collaboration, they'll never realise the full value of their investment in big data.
Tony Illingsworth, Vice President, Insights & Data at Capgemini
Tony is a Vice President in Capgemini's Insights & Data team, which delivers data governance, data science, analytics, BI and big data. He has been with Capgemini for nearly nine years, having joined from LogicaCMG in June 2007, and has over 25 years' experience in the IT industry.